stebalien changed the topic of #ipfs to: Heads Up: To talk, you need to register your nick! Announcements: go-ipfs 0.4.18 and js-ipfs 0.33 are out! Get them from dist.ipfs.io and npm respectively! | Also: #libp2p #ipfs-cluster #filecoin #ipfs-dev | IPFS, the InterPlanetary FileSystem: https://github.com/ipfs/ipfs | Logs: https://view.matrix.org/room/!yhqiEdqNjyPbxtUjzm:matrix.org/ | Forums: https://discuss.ipfs.io | Code of Conduct
shippy has joined #ipfs
sammacbeth has quit [Ping timeout: 245 seconds]
mischat has joined #ipfs
thomasanderson has quit [Remote host closed the connection]
mischat has quit [Ping timeout: 250 seconds]
thomasanderson has joined #ipfs
mischat has joined #ipfs
sammacbeth has joined #ipfs
sammacbeth has quit [Ping timeout: 250 seconds]
hph^ has quit []
Rboreal_Frippery has joined #ipfs
phs^ has joined #ipfs
sammacbeth has joined #ipfs
djdv has quit [Quit: brb]
djdv has joined #ipfs
sammacbeth has quit [Ping timeout: 244 seconds]
<DarkDrgn2k[m]> @swedneck: why ?
<Swedneck> why what?
anacrolix has quit [Quit: Connection closed for inactivity]
woss_io has quit [Ping timeout: 240 seconds]
<lord|> Swedneck: what's needed
<DarkDrgn2k[m]> what makes them unusable?
Rboreal_Frippery has quit [Ping timeout: 244 seconds]
<Swedneck> ah
<Swedneck> well hardbin is too focused on privacy, which makes it needlessly difficult to quickly paste something and share the url/hash
<lord|> yeah it requires javascript which is annoying
<lord|> but there's no way around that problem
<lord|> without a centralized backend
<Swedneck> ipfsbin i can't get to work, i presume i have to do some npm magic
<Swedneck> i mean that it encrypts text
<DarkDrgn2k[m]> it's because js-ipfs doesn't support dht :/
<lord|> obviously
<Swedneck> that's just completely pointless for sharing text snippets in public chats
<DarkDrgn2k[m]> and i guess that means you cant pin stuff
Steverman has quit [Ping timeout: 246 seconds]
<Swedneck> i'm perfectly willing to let people pin it on my gateway node
<Swedneck> i just want to restrict it to text only
<lord|> DarkDrgn2k[m]: there is a pin API for the gateway
<lord|> has to be enabled manually
<Swedneck> i don't have any issue with js
<lord|> or just use browser extensions
<lord|> for ipfs redirection
<lord|> and setup pinning locally
<lord|> Swedneck: what's wrong with the URL hash being used anyways?
<Swedneck> i don't want this for myself
<Swedneck> i want it as a public service
<Swedneck> nothing
<lord|> well, the URL spec has a convenient way to do that
<lord|> with URL hashes
<Swedneck> but with hardbin you cannot simply copy the hash, you need a decryption key
<lord|> no, you just copy a single URL lol
<Swedneck> yes, which is not what i want
<Swedneck> i know you can make a url by just using a gateway and /ipfs/Qm..
<fiatjaf> the decryption key is in the url
<Swedneck> my problem with hardbin is that it encrypts the text, and thus requires the decryption key in the URL
<fiatjaf> but using ipfs for sharing stuff that you'll no longer need in 5 minutes is pointless
<lord|> yeah, so there should be an ipfs pastebin without encryption
<fiatjaf> actually, most of the stuff currently being built on ipfs is pointless
<Swedneck> thing is, people don't only use pastebin for temporary things
<Swedneck> people use it for very non-temporary things like entire stories
<lord|> a big annoyance of ipfs is that it doesn't force UTF-8 in the browser
<fiatjaf> publish your stories
<fiatjaf> it also writes to the eternum public gateway
<anacrolix[m]> isn't it only temporary as long as nobody pins it? and isn't the encoding (like utf-8) irrelevant to IPFS?
<Swedneck> hmm, that's probably the best alternative so far
<Swedneck> although the point of pastebin is to be really simple and fast
<Swedneck> hardbin has that down correctly
<lord|> anacrolix[m]: the problem is that firefox doesn't correctly guess UTF-8
<fiatjaf> you said you're going to share a STORY
<lord|> in some cases
<fiatjaf> if you want something fast and disposable use any other non-ipfs pastebin
<Swedneck> like hastebin.com
<fiatjaf> why not?
asura_ has joined #ipfs
<Swedneck> no, i used story as an example
<lord|> hastebin is terrible for one simple reason
<lord|> it requires javascript to view the text
<lord|> otherwise the page is broken with no indication that you have to enable js
<fiatjaf> sprunge.us
<fiatjaf> pastery.net
Bromskloss has joined #ipfs
<Swedneck> fuck me for wanting to spread ipfs i guess
<Swedneck> yay centralization
<lord|> sprunge does not force HTTPS
asura has quit [Ping timeout: 250 seconds]
<lord|> or even support it
<anacrolix[m]> that seems like a shortcoming of firefox, or utf-8. can you use a self-describing format?
<fiatjaf> who cares?
asura__ has joined #ipfs
<fiatjaf> https?
<fiatjaf> YOU SHARING ERROR LOGS
<fiatjaf> you don't want https for that
<lord|> HTTPS is not just encryption
<lord|> it's authentication
<fiatjaf> error logs
<fiatjaf> you don't need that
<Swedneck> why would you not want https for that?
<lord|> it doesn't matter what it's used for
<Swedneck> wtf
<fiatjaf> Swedneck, you've ignored me when I said ipfs isn't good for everything
<lord|> the world's largest DDoS attack was caused by very low HTTPS coverage in china
zeden has quit [Quit: WeeChat 2.2]
<fiatjaf> do you think EVERYTHING should be on ipfs?
<lord|> the chinese government got ISPs to modify the contents of pages to have a bit of javascript to DDoS github
<Swedneck> almost, yes
<fiatjaf> what shouldn't?
<Swedneck> this is what i'm looking to solve
<Swedneck> what shouldn't be on IPFS is things you actively don't want to be preserved, sensitive data
<Bromskloss> Hi! Is there an up-to-date guide for how to add files without copying them? Merely using `--nocopy` doesn't seem to do the trick. (I have also done `ipfs config --json Experimental.FilestoreEnabled true`, without knowing exactly what it means.)
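[Editor's note: for reference, enabling the experimental filestore in go-ipfs 0.4.x takes the config change Bromskloss mentions plus a daemon restart before `--nocopy` has any effect. A sketch; the file path is a placeholder:]

```shell
# Enable the experimental filestore (go-ipfs 0.4.x):
ipfs config --json Experimental.FilestoreEnabled true

# Restart the daemon so the flag takes effect, then add without
# copying blocks into the repo (the file must stay at this path):
ipfs add --nocopy /path/to/largefile
```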
asura_ has quit [Ping timeout: 245 seconds]
reit has joined #ipfs
<Swedneck> which you wouldn't use a pastebin for to begin with
<lord|> (ignore that guy's cloudflare shilling)
<fiatjaf> you have a point
<fiatjaf> but you'll need a way to gather these things, and search for them
<fiatjaf> and the gathering/searching shouldn't be on ipfs
<fiatjaf> if you don't organize the data it'll be impossible for people to pin it and preserve it
<Swedneck> okay look, i'll make this really simple
<Swedneck> dead links suck
<Swedneck> text is easy to store
<Swedneck> ipfs solves this.
<lord|> just use ipfs like you would bittorrent
<fiatjaf> lord|, you've sent a 24min video
<fiatjaf> I'll watch it later
<lord|> have centralized services for listing and searching
<fiatjaf> also
<fiatjaf> that video isn't on ipfs
<lord|> like the good ole pirate bay
<fiatjaf> so maybe I won't watch it
<lord|> or torrents.csv
<fiatjaf> or Swedneck will be mad
<Swedneck> sigh
<fiatjaf> Swedneck, so you're going to store your own error logs on pastebins
<fiatjaf> Swedneck, and you think you're going to solve the problem like that?
<fiatjaf> you're going to die
<Swedneck> i already do, and have done for a while
cwahlers_ has joined #ipfs
<fiatjaf> and your error logs will all be broken
<fiatjaf> unless you get other people to continue your archiving job
<Swedneck> btw didn't you literally make a way to gather and search for stuff on ipfs?
<fiatjaf> which you won't be able to if you don't organize the shit
cwahlers has quit [Ping timeout: 244 seconds]
<Swedneck> moreso gather than search
<fiatjaf> exactly, I did, because I want to get content organized
<fiatjaf> so people can pin it
<fiatjaf> lord|, https://bigsun.xyz/
<fiatjaf> but this is not sufficient anyway
<fiatjaf> for example, I don't know how to fit pastebin content in there, unless you manually create a folder with your pastebin hashes and update it on the site every time you get a new pastebin saved
mischat has quit [Ping timeout: 252 seconds]
<lord|> fiatjaf: what's the problem with doing that?
<fiatjaf> it's a hassle
<fiatjaf> that few people will bear
<fiatjaf> anyway, if someone has a link to the old folder and you delete it from your node, not only is the link to the folder (which has the index of all pastes) lost, but there's no way to know a new version is available
<lord|> I don't think every single ipfs hash would have to be recalculated though
<fiatjaf> actually, this is a failure of the protocol
<lord|> just by adding one file
<fiatjaf> the great IPFS vision isn't so great in the end
<fiatjaf> "oh let's just create a global content-addressed system" misses that
<lord|> it's still useful for some things
<lord|> but the hype obviously boiled over
<fiatjaf> the fact that merkle trees can't point to their own future
<Swedneck> there doesn't really need to be discoverability of pastebin stuff, just the preservation of links
<lord|> &
<lord|> ^
<Swedneck> people don't search for pastebin content, they're linked to it
<lord|> yeah
<fiatjaf> and then they pin it?
<lord|> though one thing people do want to search for is free movies :)
<fiatjaf> why would they?
<Swedneck> so when the links go down, that causes quite an issue
<fiatjaf> when I solve a problem after reading a bunch of error logs I want to forget about it
<fiatjaf> now, if there was an easy way to keep that error log organized (along with the other content that helped me solve the error) then I would be inclined to save it
<fiatjaf> for posterity
<lord|> if you put something like project gutenberg on ipfs
<lord|> it would be a good idea
<lord|> because deduplication
<fiatjaf> lord|, I agree
<fiatjaf> I'm using IPFS mostly for that kind of thing
<fiatjaf> there's a guy who actually put project gutenberg, if you want that
<lord|> I tried pinning a larger selection of files than that guy did
<lord|> took too long
<lord|> unless someone else has since hosted every single file?
<fiatjaf> would take a lot anyway
<fiatjaf> what interesting archives do you have there, lord| ?
<lord|> excluding anything audio/visual would be a good idea
<fiatjaf> please sign up for bigsun.xyz and post them there
<fiatjaf> I'm going to sleep
<lord|> copy all the things from https://www.reddit.com/r/IPFS_Hashes/ maybe
<lord|> dunno how many of those links are dead
<fiatjaf> I was asking about you
<lord|> I mean, that's what I might do
<lord|> I don't have anything big pinned on ipfs
<lord|> mostly just error logs
<fiatjaf> not hentai collection
<fiatjaf> so you're not using ipfs correctly
<fiatjaf> :P
<lord|> always preserve hentai collections
<lord|> so that future historians can know about the variety of porn
<Swedneck> hmm, thankfully ipfessay is easily modifiable
<fiatjaf> I just solved the past-future problem
<Swedneck> i think i'll just edit the text a bit, and use that
<fiatjaf> instead of updating your directory, copy the old directory to _old
<fiatjaf> then update
<fiatjaf> and save that hash
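[Editor's note: fiatjaf's `_old`-folder trick can be sketched with MFS commands. Directory names are placeholders; `ipfs files cp` copies by reference, so the snapshot is cheap:]

```shell
# Snapshot the current pastes directory before updating it, so old
# links stay live and the new root points at its own history:
OLD=$(ipfs files stat --hash /pastes)    # hash of the current version
ipfs files cp "/ipfs/$OLD" /pastes/_old  # copy-by-reference snapshot
# ...add the new paste into /pastes, then share the new root hash:
ipfs files stat --hash /pastes
```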
<fiatjaf> ~~~~~~~~~~~~
<fiatjaf> PARTY
<fiatjaf> ~~~~~~~~~~~~
Bromskloss has quit [Quit: Page closed]
ruby32 has joined #ipfs
phs^ has quit []
hphs has joined #ipfs
user_51 has quit [Quit: WeeChat 2.3]
mischat has joined #ipfs
thomasanderson has quit [Remote host closed the connection]
sammacbeth has joined #ipfs
dimitarvp has quit [Quit: Bye]
trashhalo[m] has left #ipfs ["User left"]
user_51 has joined #ipfs
hurikhan77 has quit [Ping timeout: 252 seconds]
_whitelogger has joined #ipfs
hurikhan77 has joined #ipfs
sammacbeth has quit [Ping timeout: 250 seconds]
caveat has joined #ipfs
sammacbeth has joined #ipfs
thomasanderson has joined #ipfs
caveat has quit [Ping timeout: 250 seconds]
hurikhan77 has quit [Read error: Connection reset by peer]
hurikhan77 has joined #ipfs
sammacbeth has quit [Ping timeout: 244 seconds]
sammacbeth has joined #ipfs
hurikhan77 has quit [Read error: Connection reset by peer]
hurikhan77 has joined #ipfs
hurikhan77 has quit [Read error: Connection reset by peer]
kapil____ has joined #ipfs
hurikhan77 has joined #ipfs
hurikhan77 has quit [Read error: Connection reset by peer]
hurikhan77 has joined #ipfs
sammacbeth has quit [Ping timeout: 240 seconds]
sammacbeth has joined #ipfs
hurikhan77 has quit [Read error: Connection reset by peer]
hurikhan77 has joined #ipfs
hurikhan77 has quit [Client Quit]
user_51 has quit [Ping timeout: 245 seconds]
user_51 has joined #ipfs
bongozig_ has quit [Ping timeout: 250 seconds]
cwahlers has joined #ipfs
cwahlers_ has quit [Ping timeout: 250 seconds]
nonono has joined #ipfs
<postables[m]> fiatjaf: If you want I can pipe your hashes through my experimental search engine. I index content with Tesseract, TextRank, and Tensorflow. Will probably be adding DHT sniffing to pick up content soon
<DarkDrgn2k[m]> Soooo... vps host lost my encrypted drive holding IPFSs' content
<DarkDrgn2k[m]> can i "ipfs pin add HASH HASH HASH HASH" or does it have to be on per line?
<postables[m]> One per line
<postables[m]> If you want to be real swanky
mischat has quit [Ping timeout: 252 seconds]
lidel` has joined #ipfs
lidel has quit [Ping timeout: 240 seconds]
lidel` is now known as lidel
bongozig_ has joined #ipfs
bongozig__ has joined #ipfs
bongozig_ has quit [Ping timeout: 250 seconds]
_whitelogger has joined #ipfs
lassulus_ has joined #ipfs
lassulus has quit [Ping timeout: 240 seconds]
lassulus_ is now known as lassulus
hurikhan77 has joined #ipfs
Guanin has quit [Remote host closed the connection]
thomasanderson has quit [Remote host closed the connection]
kiera[m] has joined #ipfs
mischat has joined #ipfs
thomasanderson has joined #ipfs
Rboreal_Frippery has joined #ipfs
thomasanderson has quit [Ping timeout: 250 seconds]
mischat has quit [Ping timeout: 260 seconds]
cwahlers_ has joined #ipfs
cwahlers has quit [Ping timeout: 268 seconds]
pvh has joined #ipfs
}ls{ has joined #ipfs
mischat has joined #ipfs
caveat has joined #ipfs
caveat has quit [Ping timeout: 250 seconds]
user_51 has quit [Quit: WeeChat 2.3]
mischat has quit [Ping timeout: 252 seconds]
Kashish has joined #ipfs
Kashish has quit [Quit: Page closed]
caveat has joined #ipfs
caveat has quit [Ping timeout: 240 seconds]
thomasanderson has joined #ipfs
thomasanderson has quit [Ping timeout: 250 seconds]
cwahlers_ has quit [Ping timeout: 240 seconds]
cwahlers has joined #ipfs
dreadukas[m] has joined #ipfs
Rboreal_Frippery has quit [Ping timeout: 245 seconds]
mischat has joined #ipfs
shippy has quit [Quit: Textual IRC Client: www.textualapp.com]
mischat has quit [Ping timeout: 252 seconds]
kapil____ has quit [Quit: Connection closed for inactivity]
yf33 has joined #ipfs
xcm has quit [Remote host closed the connection]
xcm has joined #ipfs
q6AA4FD has joined #ipfs
BeerHall has joined #ipfs
xcm has quit [Remote host closed the connection]
xcm has joined #ipfs
vyzo has quit [Quit: Leaving.]
vyzo has joined #ipfs
<fiatjaf> postables[m], where's that search engine?
kapil____ has joined #ipfs
_whitelogger has joined #ipfs
i1nfusion has quit [Remote host closed the connection]
i1nfusion has joined #ipfs
spinza has quit [Quit: Coyote finally caught up with me...]
spinza has joined #ipfs
cwahlers has quit [Ping timeout: 245 seconds]
cwahlers has joined #ipfs
mischat has joined #ipfs
woss_io has joined #ipfs
spinza has quit [Quit: Coyote finally caught up with me...]
mischat has quit [Ping timeout: 252 seconds]
karwell has quit [Remote host closed the connection]
spinza has joined #ipfs
woss_io has quit [Ping timeout: 246 seconds]
woss_io has joined #ipfs
i1nfusion has quit [Remote host closed the connection]
i1nfusion has joined #ipfs
yf33 has quit [Quit: Textual IRC Client: www.textualapp.com]
i1nfusion has quit [Remote host closed the connection]
i1nfusion has joined #ipfs
nonono has quit [Ping timeout: 246 seconds]
plexigras has joined #ipfs
spinza has quit [Quit: Coyote finally caught up with me...]
spinza has joined #ipfs
cwahlers_ has joined #ipfs
kapil____ has quit [Quit: Connection closed for inactivity]
cwahlers has quit [Ping timeout: 246 seconds]
kapil____ has joined #ipfs
SunflowerSociety has quit [Read error: Connection reset by peer]
<fiatjaf> I want to index all ipfs content too, but not its content. instead I want to do a reverse index.
<fiatjaf> so you can search for a hash Qmxyz... and get other objects that link to it along with their path
<fiatjaf> please help me.
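[Editor's note: for content a node already has, the reverse index fiatjaf describes can be approximated by dumping the link graph with `ipfs refs` and inverting it. A sketch; `$ROOT` is a placeholder hash:]

```shell
# Emit one "parent child linkname" edge per line for everything
# under $ROOT, then invert it so each child lists its referrers:
ipfs refs -r -u --format="<src> <dst> <linkname>" "$ROOT" \
  | awk '{ print $2, "<-", $1 "/" $3 }' \
  | sort -u > reverse-index.txt
```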
malaclyps has quit [Read error: Connection reset by peer]
malaclyps has joined #ipfs
stoopkid has quit [Quit: Connection closed for inactivity]
asura__ has quit [Remote host closed the connection]
asura__ has joined #ipfs
Lymkwi has quit [Ping timeout: 246 seconds]
stoopkid has joined #ipfs
caveat has joined #ipfs
user_51 has joined #ipfs
caveat has quit [Ping timeout: 250 seconds]
woss_io has quit [Ping timeout: 246 seconds]
zxk has quit [Quit: ZNC - https://znc.in]
asura__ has quit [Remote host closed the connection]
asura__ has joined #ipfs
_whitelogger has joined #ipfs
dimitarvp has joined #ipfs
MDude has quit [Ping timeout: 244 seconds]
kapil____ has quit [Quit: Connection closed for inactivity]
cwahlers has joined #ipfs
cwahlers_ has quit [Ping timeout: 250 seconds]
eater has quit [Ping timeout: 250 seconds]
notkoos has joined #ipfs
leeola has joined #ipfs
<r0kk3rz> not sure you could do that without having a full index of everything
toppler has joined #ipfs
BeerHall has quit [Ping timeout: 250 seconds]
MDude has joined #ipfs
eater has joined #ipfs
ruby32 has quit [Remote host closed the connection]
ruby32 has joined #ipfs
luginbash[m] has joined #ipfs
zxk has joined #ipfs
<deltab> fiatjaf: okay, so you'll need a database to store what you find, and a crawler to find it
<deltab> what kind of links do you want to index?
<jamiedubs[m]1> No all-hands this week right? Is there one for next week?
i1nfusion has quit [Remote host closed the connection]
i1nfusion has joined #ipfs
<DarkDrgn2k[m]> > One per line
<DarkDrgn2k[m]> looks like it's not one per line.. pinned everything
<DarkDrgn2k[m]> i was using xargs... didn't do -L1
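[Editor's note: this worked because `ipfs pin add` accepts several hashes in one invocation, so the batched xargs call is fine. Both forms below are equivalent; `hashes.txt` is a placeholder file with one hash per line:]

```shell
# One `ipfs pin add` process per hash (slower, failures isolated):
xargs -L1 ipfs pin add < hashes.txt

# Batched: xargs packs many hashes into a single invocation, which
# `ipfs pin add` accepts -- this is what pinned everything at once:
xargs ipfs pin add < hashes.txt
```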
<DarkDrgn2k[m]> is there any way to see activity on a specific hash?
pep7 has quit [Ping timeout: 246 seconds]
cwahlers_ has joined #ipfs
cwahlers has quit [Ping timeout: 244 seconds]
kapil____ has joined #ipfs
<fiatjaf> deltab: how do I crawl this thing?
<fiatjaf> I mean, is there a way to slowly query all hashes on all nodes?
<deltab> no, but you can query the hashes you have
<deltab> it's also possible to watch which hashes are announced
<DarkDrgn2k[m]> http://stats.myipfs.site <- trying to figure out why i'm running so high in bandwidth usage every day when i don't have any popular ipfs pins pinned
* DarkDrgn2k[m] uploaded an image: image.png (13KB) < https://matrix.org/_matrix/media/v1/download/tomesh.net/lIahqUyvnmOWtWujYpfOHslG >
<DarkDrgn2k[m]> is 11gigs on an "idle" ipfs server normal?
i1nfusion has quit [Remote host closed the connection]
i1nfusion has joined #ipfs
ONI_Ghost has joined #ipfs
<voker57> what's ipfs version?
jpaa has quit [Ping timeout: 272 seconds]
<voker57> it's high
<devoid_> swe
<devoid_> dx
<devoid_> sd
<devoid_> xd
jpaa has joined #ipfs
borkDoy has joined #ipfs
borkDoy has quit [Client Quit]
ygrek has joined #ipfs
<postables[m]> Still pretty WIP, haven't updated the deployed version in awhile
<fiatjaf> nice, postables[m]
<fiatjaf> I mean, I don't know anything about tesseract, tensorflow and textrank, seems like pretty complex
<fiatjaf> how do you plan to do dht sniffing? is there a tutorial for that somewhere?
<fiatjaf> with good search engines, we can live in an internet of pure hashes and no names, I think
<fiatjaf> (except for the name of the search engine?)
<Swedneck> it doesn't really seem to work, at all
<Swedneck> "ipfs" should return some result
thomasanderson has joined #ipfs
<postables[m]> swedneck: are you talking about the search engine? try the queries listed under the weekly ones.
<postables[m]> fiatjaf: For DHT sniffing? dunno yet, simplest way is just `ipfs log tail` and copy content hashes
Mateon3 has joined #ipfs
Mateon1 has quit [Ping timeout: 240 seconds]
Mateon3 is now known as Mateon1
<Swedneck> ah, that works
<Swedneck> not many results, but still
<postables[m]> like i said its extremely basic and WIP. I haven't given it much content to index yet, it's just some stuff i've uploaded myself, and other stuff people have given it to index
<postables[m]> most of the work has been done on the backend for Lens itself, optimizing how we search through content, etc....
<Swedneck> i'm just adding the fdroid repo to ipfs, wanna index that?
<Swedneck> 50 gigs of stuff
<postables[m]> depends on the kind of content. It can index pretty much any text files, pictures, etc...
<postables[m]> *edit:* ~~depends on the kind of content. It can index pretty much any text files, pictures, etc...~~ -> depends on the kind of content. It can index pretty much any text files, pdf, web pages, pictures, etc...
<Swedneck> well it's mostly APKs and source code, but it has an index built in
<postables[m]> sure when im back at my computer tonight i'll look at pinning that
<postables[m]> wiping the data stored on these nodes in like less than a week tho
<Swedneck> i'm just trying to get a working fdroid mirror on ipfs rn
<Swedneck> my biggest issue is `ipfs add` being dick-punchingly slow
<Swedneck> having the data on HDD doesn't help either
<postables[m]> what're the specs of your node?
<postables[m]> i've been running with bloom filter and hash on read lately and its awesome
<Swedneck> whatnow
<Swedneck> as for specs: shitty old pc from like 2012, if not even earlier
<Swedneck> lemme get detailed specs
<jamiedubs[m]1> Re DHT sniffing for search you should check out ipfssearch
<postables[m]> ah yea dont enable bloom filter or hash on read then lol
<postables[m]> i had to up my ipfs nodes to 40 vCPU cores for hash on read to not cripple it
<postables[m]> bloom filter seems to be quite useful and only memory intensive
<Swedneck> CPU: AMD A8-6500 APU (4) @ 3.5GHz
<Swedneck> i have shit tons of memory
<postables[m]> jamiedubs: thanks ill check it out
<Swedneck> Memory: 4765MiB / 23292MiB
<postables[m]> you should be able to use bloom filter then
<postables[m]> *edit:* ~~you should be able to use bloom filter then~~ -> you should be able to use bloom filter then, i think my nodes have a 512MB filter
<Swedneck> what does bloom filter do?
<postables[m]> lets you check whether or not your node already has nodes from the merkle-dag i believe. you can avoid re-adding merkle-dag nodes that you already have
<postables[m]> speeds up adding files and what not
<postables[m]> hash on read allows you to preserve integrity but its stupidly expensive
<Swedneck> oh fuck yes
<Swedneck> how do i enable bloom filter
<Swedneck> that sounds like exactly what i want
<Swedneck> what do you think is a good size for me?
<postables[m]> im using `536870912`
<postables[m]> *edit:* ~~im using `536870912`~~ -> im using `536870912` with 64GB RAM on each node
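[Editor's note: the bloom filter being discussed is the `Datastore.BloomFilterSize` config key in go-ipfs (size in bytes; 0 disables it). Setting the 512 MB value quoted above looks like this:]

```shell
# 512 MB bloom filter for the blockstore (value is in bytes);
# restart the daemon afterwards for it to take effect:
ipfs config --json Datastore.BloomFilterSize 536870912

# Hash-on-read (expensive, as noted above) is a separate toggle:
ipfs config --json Datastore.HashOnRead true
```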
<Swedneck> alright
<Swedneck> jesus christ just 524288 took the time to add the repo from 2h to 15 min
<Swedneck> ok maybe not quite, it says 30 min now
<Swedneck> is there a way to calculate how much i need?
<Swedneck> found a link to a calculator but i'm not sure how to use it
<postables[m]> I couldn't get the calculator to work lol
<postables[m]> I just set a large number and yolo 🤷‍♀️
<Swedneck> lol
<Swedneck> i mean i'm not even using 4GB
<Swedneck> might as well go with something ridic
thomasanderson has quit [Remote host closed the connection]
xcm has quit [Remote host closed the connection]
<Swedneck> well, it's at 1h 30min now
<Swedneck> with 1 gig of bloom
<Swedneck> i guess a lot of it just hasn't been added before
thomasanderson has joined #ipfs
xcm has joined #ipfs
cwahlers_ has quit [Read error: Connection reset by peer]
cwahlers has joined #ipfs
<DarkDrgn2k[m]> voker57: version is "AgentVersion": "go-ipfs/0.4.18/",
vquicksilver has quit [Quit: WeeChat 2.2]
vquicksilver has joined #ipfs
luigi has joined #ipfs
<Swedneck> ...now it says 4 hours
<Swedneck> it seems way too slow for even an HDD, it's not at full CPU usage either
mischat has joined #ipfs
mischat has quit [Ping timeout: 252 seconds]
<postables[m]> how many people are providing the content?
<Swedneck> i'm talking about adding it
<Swedneck> not downloading
<Swedneck> oh heck, i forgot to use --nocopy
reit has quit [Ping timeout: 246 seconds]
<voker57> DarkDrgn2k[m]: try git
hphs has quit []
hph^ has joined #ipfs
lord| has quit [Read error: Connection reset by peer]
lord| has joined #ipfs
dodo33[m] has joined #ipfs
cyclical[m] has joined #ipfs
dolphy has quit [Remote host closed the connection]
dolphy has joined #ipfs
mischat has joined #ipfs
<Swedneck> alright i can't seem to get rid of the pins i made
<Swedneck> i ran `ipfs pin rm -r HASH`, and ran gc
<Swedneck> the blocks are still there
<DarkDrgn2k[m]> @swedneck: i found gc doesn't do anything :/
<DarkDrgn2k[m]> someone said it will only act once the max storage value is hit.. idk
<Swedneck> oof
<Swedneck> although `ipfs pin rm` is taking a long time even though it should've removed the pins the first time i ran it
<Swedneck> yeah i don't think ipfs is ready for mirroring fdroid repos yet
<Swedneck> ..and it returns "not pinned"
i1nfusion has quit [Remote host closed the connection]
zeden has joined #ipfs
i1nfusion has joined #ipfs
mischat has quit [Remote host closed the connection]
kapil____ has quit [Quit: Connection closed for inactivity]
thomasanderson has quit [Remote host closed the connection]
zeden has quit [Quit: WeeChat 2.2]
<Swedneck> yeah i can't figure out how to remove this data
<Swedneck> my storage limit is 10GB, yet there's 50GB pinned
<fiatjaf> pinning is wrong
<fiatjaf> never pin
<fiatjaf> just add to mfs
<fiatjaf> pinning is broken
<voker57> this
<Swedneck> w-what
<voker57> or check that out https://github.com/ipfs/go-ipfs/pull/4757
<voker57> gc should work tho
<voker57> check that you don't have files pinned via mfs or something
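[Editor's note: voker57's check, and fiatjaf's "add to MFS instead of pinning" approach, look roughly like this. Paths and `<hash>` are placeholders; anything reachable from the MFS root is a GC root:]

```shell
# See what MFS is holding (these blocks survive GC without pins):
ipfs files ls -l /

# Keep content alive via MFS instead of the pinner:
ipfs files cp /ipfs/<hash> /keep/myfile

# Dropping it later is a removal plus a GC run:
ipfs files rm -r /keep/myfile
ipfs repo gc
```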
<Swedneck> but i'm running gc
<Swedneck> it's not doing anything
<Swedneck> i do not
<voker57> are you using badgedb?
<voker57> badgerds *
cwahlers_ has joined #ipfs
<Swedneck> no
<Swedneck> flatfs
<voker57> > although `ipfs pin rm` is taking a long time even though it should've removed the pins the first time i ran it
<voker57> another fun feature of IPFS I made a PR for: if a pin is not present it searches through your whole repo to check if it is pinned indirectly
cwahlers has quit [Ping timeout: 250 seconds]
<devoid_> hello whyrusleeping[m]
<devoid_> whyrusleepin
<devoid_> whyrusleeping
<devoid_> this question has been asked of me many tmes when i was in high school :(
poppy has joined #ipfs
xcm has quit [Remote host closed the connection]
poppy has left #ipfs [#ipfs]
mischat has joined #ipfs
xcm has joined #ipfs
mischat has quit [Ping timeout: 252 seconds]
mischat has joined #ipfs
caveat has joined #ipfs
caveat has quit [Ping timeout: 250 seconds]
zeden has joined #ipfs
thomasanderson has joined #ipfs
aye7 has joined #ipfs
zeden has quit [Quit: WeeChat 2.2]
thomasanderson has quit [Ping timeout: 246 seconds]
zeden has joined #ipfs
aye7 has quit [Remote host closed the connection]
aye7 has joined #ipfs
zeden has quit [Client Quit]
<fiatjaf> ipfs is mostly broken
<Swedneck> except it's worked just fine for the longest time
<Swedneck> i don't know why it would suddenly start not working without any updates
<fiatjaf> have you ever tried to pin rm stuff?
spinza has quit [Quit: Coyote finally caught up with me...]
<Swedneck> yes, it worked fine in the past
}ls{ has quit [Quit: real life interrupt]
<suitsandtux[m]> voker57:
<voker57> suitsandtux[m]:
<suitsandtux[m]> Oh crap, sorry. Apparently that happened randomly? I was just browsing the convo
<suitsandtux[m]> Must have pushed something
xcm has quit [Remote host closed the connection]
xcm has joined #ipfs
spinza has joined #ipfs
<postables[m]> How is IPFS broken
<postables[m]> If it suddenly stops working, your problem isn't IPFS it's your installation, or your system.
<postables[m]> I've had IPFS running 24/7 in a clustered environment since April of this year, and save for a few issues it's been great
<Swedneck> only issue here is being unable to remove my pins
<postables[m]> Upgrade your repo to badgerds
<postables[m]> flatfs is trash
mischat has quit [Remote host closed the connection]
<Swedneck> won't that take literal hours now that i have 50 gigs of pinned data though
<voker57> badgerds is great except occasionally it breaks down and loses all your data
<voker57> and I never had problems with flatfs
<postables[m]> Define occasionally.
<postables[m]> swedneck: dunno really depends on your system I think.
jamesaxl has quit [Quit: WeeChat 2.3]
<postables[m]> I've done a repo upgrade with I think 30GB and it was over in like an hour or something
<Swedneck> just a sanity check: removing the pin and running gc should clear out the data, right?
<postables[m]> It should
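[Editor's note: the expected removal sequence, for reference; `<hash>` is a placeholder:]

```shell
# Confirm what the node thinks is recursively pinned:
ipfs pin ls --type=recursive

# Remove the pin, then garbage-collect unreferenced blocks:
ipfs pin rm -r <hash>
ipfs repo gc
```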
i1nfusion has quit [Remote host closed the connection]
i1nfusion has joined #ipfs
<Swedneck> something's real fucky then
plexigras has quit [Ping timeout: 246 seconds]
<postables[m]> I mean technically your node is in a broken state
<postables[m]> Your repo is larger than it should be, and your node is probably having a bit of a fit trying to figure out what to do
<Swedneck> hmm
mauz555 has joined #ipfs
<Swedneck> what if i set the limit to above the size it is rn?
<Swedneck> and then try to gc
<postables[m]> I would try shutting down your node, increasing your storage capacity in the config file and restarting it
<postables[m]> That could work
<Swedneck> gonna try that
<postables[m]> Probably wouldn't hurt to turn on debugging either
sammacbeth has quit [Ping timeout: 250 seconds]
<Swedneck> StorageMax, right?
nast has joined #ipfs
<postables[m]> Yea
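[Editor's note: the suggested change as a sketch; the 100GB figure is only an example. `StorageMax` is the watermark GC aims for, not a hard cap on repo size:]

```shell
# Stop the daemon first, then raise the advisory storage cap:
ipfs config Datastore.StorageMax "100GB"

# Restart (pass --enable-gc if you want automatic GC at the cap):
ipfs daemon --enable-gc
```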
<LHommedesCitrons> ipfs works pretty well for me except sometimes it hangs when I try to pin something
sammacbeth has joined #ipfs
<Swedneck> nope, didn't work
BeerHall has joined #ipfs
maxzor_ has quit [Remote host closed the connection]
sammacbeth has quit [Ping timeout: 250 seconds]
xcm has quit [Remote host closed the connection]
<LHommedesCitrons> The biggest thing that bugs me about IPFS is the sheer amount of dead links
<LHommedesCitrons> I try to pin as much as possible
xcm has joined #ipfs
<devoid_> goodnight my fellow human beings
sammacbeth has joined #ipfs
mauz555 has quit []
mischat has joined #ipfs
nast has quit [Quit: Leaving]
Papa_Alpaka has joined #ipfs
Papa_Alpaka has quit [Client Quit]
mischat has quit [Ping timeout: 252 seconds]
thomasanderson has joined #ipfs
aye7 has quit [Ping timeout: 240 seconds]
aye7 has joined #ipfs
q6AA4FD has quit [Quit: ZNC 1.7.1 - https://znc.in]
thomasanderson has quit [Remote host closed the connection]
sammacbeth has quit [Ping timeout: 244 seconds]
<voker57> [00:56:40] <postables[m]> Define occasionally.
<voker57> happened to me once, read on github about second time