IPFS is probably spiritually closer to BT in terms of censorship and content sharing, and could be a valid replacement if the right incentive-based sharing applications are built on top of it. Right now there is little incentive to seek out and add content to IPFS; if that can be cracked, BT could be wiped out by it.
However, the trend now for most people is to subscribe to Spotify, Netflix, and Amazon Prime, thinking there is an endless selection of great and original content. Instead you quickly run out of that and end up listening to and watching whatever the algorithm tells you to.
Spotify actually does have a very large number of songs, and I have no trouble finding classics in many genres - as well as completely new work.
That may be true, but in my anecdotal experience I am more often recommended specific current artists or songs than any classics or long-forgotten gems, lending credence to the theory that Spotify cares less about what I'd like and more about what it wants me to listen to.
For me, what often replaces Bittorrent for music is YouTube; still built on copyright infringement, but very unlikely to be blocked or get you in trouble, and it's easier than Spotify (no need for accounts).
E.g. when you pull up an auto-generated playlist or look at recommended suggestions, are you just looking at content which the provider prefers to show you? Can you trust reviews from the same company that's selling you the content? If some content isn't on there, will it just disappear into the abyss? If the provider decides to drop some content, will you ever see it again?
And yes, if content isn't on Spotify for me, it basically doesn't exist. If it's on Bandcamp maybe I will download it.
Spotify allows you to see songs that you have added that have been removed from availability as greyed-out names.
Large is relative. They have a large number of songs, but the actual choice is quite small. It's limited by age, country, and even genre. The catalog seems to consist mostly of popular Western music and music from the last 10-15 years. Non-Western music, or less popular music from the 20th century, is quite hard to find on Spotify or any other music streaming service. YouTube is a better source for this, though it has limitations too. Spotify also used to have audiobooks, but I think those have since disappeared because of competing commercial audiobook streaming services.
We live in a great world today, with all the information and entertainment cheaply available to everyone... as long as you just follow the local mainstream and don't have much demand for specific details. It's great and sad at the same time.
The video streaming industry is moving to silos of content with varying levels of success which, regardless of the price, is bad for the user experience.
For example, the HBO Go app is bad in terms of usability, and (at least in my country) the streaming quality is really bad. I can watch 4K HDR on Netflix, no problem, and it starts instantly, but HBO Go always looks like some low-bandwidth 720p content, especially on camera pans.
"DOS is still great, Why does it need a replacement?"
If something is robust and efficient, why does there need to be something new to replace it? It serves the purpose and solves the problem well; why would you replace it with something new simply for the sake of something new?
DOS isn't a great example; DOS is a much bigger system than something like BT. BT is more of a protocol than a piece of software, and even in that case there are pieces of software which have existed for decades without needing to be replaced, for the reasons above.
New does not mean better, and just because something is old does not mean it's bad or there is something better out there. Perhaps we come up with something that comes after torrents, but we shouldn't do it for the sake of it, it should be because torrents don't work for a use case or we've come up with a faster / more robust way of doing it.
It works perfectly fine as it is in my opinion.
It uses a P2P protocol called Dat, which is similar to BitTorrent but supports changes to the data and realtime streaming https://www.datprotocol.com/
It's file-sharing mixed into the Web. Beaker uses Dat as a drop-in replacement for HTTP, which means you can browse Dat sites, create new sites within the browser, and use Web APIs to read/write/watch files https://beakerbrowser.com/docs/apis/dat.html
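The part that sets Dat apart from BitTorrent, mutability, comes from an append-only log of changes that peers can sync incrementally. Here is a toy sketch of that idea in Python (this is an illustration of the concept only, not the actual Dat/hypercore wire format; `VersionedArchive` and its methods are made-up names):

```python
import hashlib
import json

class VersionedArchive:
    """Toy append-only log, loosely inspired by Dat's design: every change
    is a new entry chained to the previous one by hash, so the archive is
    mutable to readers while the history stays verifiable."""

    def __init__(self):
        self.log = []  # list of (entry_hash, entry) pairs

    def write(self, path, content):
        prev_hash = self.log[-1][0] if self.log else ""
        entry = {"path": path, "content": content, "prev": prev_hash}
        entry_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.log.append((entry_hash, entry))

    def read(self, path):
        # The latest entry for a path wins, giving a mutable file view.
        for _, entry in reversed(self.log):
            if entry["path"] == path:
                return entry["content"]
        raise FileNotFoundError(path)

    def entries_after(self, version):
        # A peer already at `version` only needs to fetch the newer entries.
        return self.log[version:]

archive = VersionedArchive()
archive.write("/index.html", "<h1>v1</h1>")
archive.write("/index.html", "<h1>v2</h1>")
print(archive.read("/index.html"))        # prints the latest version
print(len(archive.entries_after(1)))      # prints 1: only one entry to sync
```

A plain BitTorrent swarm would need a whole new torrent for the v2 archive; here a peer that already has version 1 fetches a single log entry.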
The people using these generally have no clue they are pirating. It is just this box they bought that offers great value.
This is what has replaced Bittorrent because it is immediately accessible to the masses for $199 or whatever those boxes cost.
Some addons are just fronts for piracy sites that host videos, but there are some that use BitTorrent to stream content.
Getting a box is extremely cheap too. Buy any Chinese Android box, connect it to your TV, and you are good to go. More mainstream, there are people who buy boxes, install the addons, and resell them at a higher price to people who don't want to mess with technology.
The other is the increasing orwellization of online life, with most computer-mediated interactions and data storage happening in one of a few silos, all of which are cozy with the authorities.
For video content as well as content that's not massively popular in the population, as it was before, file sharing still provides the best experience and the best choices (content, bandwidth requirement, file sizes, file formats, etc.) for those who know how to use it.
I have daily withdrawals from Illustrator, though.
I think that shows why Bittorrent might be here to stay for a while, it's willing to make pretty deep changes to stay with the times.
Using open web standards, WebTorrent connects website users together to form a distributed, decentralized browser-to-browser network for efficient file transfer.
Existing BitTorrent clients, on the other hand, have something to lose:
* WebTorrent peers are less likely to be connectible (behind NAT etc.)
* WebTorrent peers are more likely to "hit and run" (by navigating away from the page, you stop seeding)
* WebTorrent peers are more likely to favor sequential downloads instead of rarest first (for in-browser playback)
Being incompatible with regular BitTorrent means it does not get any utility from existing swarms (unless regular BitTorrent clients gain widespread WebTorrent support, which is unlikely).
This means WebTorrent is mostly relevant for content delivery, for use cases where a site offloads bandwidth usage onto its users (like PeerTube, for example) and the site itself always runs a seed.
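The sequential-vs-rarest-first point above can be made concrete with a toy piece picker (purely illustrative, not WebTorrent's or any client's actual code): rarest-first keeps the swarm healthy by replicating scarce pieces, while a streaming player just wants the next index.

```python
from collections import Counter

def rarest_first(wanted, peer_bitfields):
    """Classic BitTorrent strategy: pick the wanted piece held by the fewest peers."""
    availability = Counter()
    for bitfield in peer_bitfields:
        availability.update(bitfield)
    return min(wanted, key=lambda piece: availability[piece])

def sequential(wanted):
    """Streaming-friendly strategy: just take the lowest-index piece."""
    return min(wanted)

wanted = {0, 1, 2, 3}
peers = [{0, 1, 2}, {0, 1}, {0, 3}]  # pieces 2 and 3 are each held by one peer

print(sequential(wanted))            # 0: best for immediate in-browser playback
print(rarest_first(wanted, peers))   # a scarce piece: best for swarm health
```

A swarm full of sequential downloaders tends to over-replicate early pieces and starve the rare ones, which is exactly the worry with hit-and-run browser peers.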
Network speeds have increased sufficiently that keeping vast local libraries of content the way we used to makes less sense than streaming; I get speeds from Netflix or Amazon comparable or better than I used to get over USB from external hard disks.
But jokes aside, I don't see much need for something to replace BitTorrent as a means of distributing content. Gradual improvements, sure; something completely new, not really.
Where big changes are going to be needed is content discovery - as big torrent search sites disappear it's becoming more apparent they and not the trackers are the weak point.
The idea is that you can actually query stuff inside a torrent; like actually perform an SQL query and only the pieces relevant to the query will be downloaded.
This sort of allows for distributed querying if you think about it, since your query could be satisfied by many different peers that are seeding/leeching.
Imagine a Web built this way. Where sites are served by people that visit them and are not just static sites, but fully queryable as you'd expect in the normal HTTP web.
I wrote more on this here: https://medium.com/@lmatteis/torrentnet-bd4f6dab15e4
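The core of the idea amounts to shipping a small index with the torrent that maps query predicates to piece numbers, so a client downloads only the pieces a query touches. A minimal sketch (hypothetical; `PIECE_INDEX` and `pieces_for_query` are made-up names, not part of any real client):

```python
# Toy sketch of query-driven piece selection. A torrent of row-oriented
# data carries a small index mapping key ranges to piece numbers; a query
# consults the index and requests only the overlapping pieces.

PIECE_INDEX = [
    # (min_key, max_key, piece_number) for each piece of the dataset
    (0, 99, 0),
    (100, 199, 1),
    (200, 299, 2),
    (300, 399, 3),
]

def pieces_for_query(lo, hi):
    """Pieces needed for e.g. SELECT ... WHERE key BETWEEN lo AND hi."""
    return [p for (mn, mx, p) in PIECE_INDEX if not (hi < mn or lo > mx)]

# A query over keys 150..250 needs only pieces 1 and 2; the rest of the
# torrent is never downloaded, and different peers can serve the two pieces.
print(pieces_for_query(150, 250))  # [1, 2]
```

Since each piece request is independent, the result set can indeed be assembled from many different seeders and leechers at once, which is the distributed-querying angle.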
I would say that BitTorrent just fits its particular design constraints fairly well, so I don't see anything replacing it without changing the use-case patterns.
In other words, BitTorrent will be the next BitTorrent.
Compare modern clients, which can easily handle seeding tens of thousands of files across hundreds of different trackers with minimal resource utilization, against the reference client people used back in the early 2000s: you only had one way to connect to trackers and other clients, the traffic was trivial to detect and filter, and the client only supported one file at a time while using 100% of a CPU.
Some content providers are using P2P to complement their CDNs with more off-net capillarity (I cannot share details in public).
Others, like Alibaba, are using P2P to synchronize content among servers. Think of it as a faster and simpler way to keep server farm contents in sync.
What I'm still missing is using P2P for configuration and backup.
The first is Scale. Ever greater scale in terms of data, number of endpoints, workflows, locations… you name it. Moore's law means more computers doing more things at every point in the enterprise, but especially at the edge. Scaling enterprise systems will be one of the great challenges over the next decade, and not just for the web monsters. Everyone will need to contend with scale. The good news is nothing scales like P2P. It's organic. When every client is also a server, greater scale simply means even more supply and faster speeds. It's the reason 20 engineers at BitTorrent could build software that moves whole percentage points of total Internet traffic, a traffic volume ten or twenty times the size of even the largest websites with thousands of employees and large infrastructure and operating budgets.
The second is Reliability. Making workflows reliable is quite expensive with traditional architectures. Building in redundancy for high availability can be very expensive and grossly inefficient in the client-server model. Conversely, P2P is naturally resilient and reliable when there are many potential sources for the data. If one is not available, no problem, pick among the thousands of alternatives. P2P is resilient while remaining efficient in resource utilization.
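That resilience property is easy to sketch (an illustrative toy, not any real library; `fetch_with_failover` is a made-up name): with many candidate sources, a fetch only fails if every single peer is down.

```python
import random

def fetch_with_failover(piece, peers, is_up):
    """Try peers in random order and return the first one that can serve
    the piece. With thousands of candidate sources, the fetch only fails
    when all of them are unavailable."""
    for peer in random.sample(peers, len(peers)):
        if is_up(peer):
            return (peer, piece)
    raise ConnectionError("no peer available for piece %d" % piece)

peers = ["peer-%d" % i for i in range(1000)]
down = set(random.sample(peers, 990))  # even with 99% of peers offline...
peer, piece = fetch_with_failover(42, peers, lambda p: p not in down)
print(peer not in down)  # True: ...a live source is still found
```

Contrast this with client-server high availability, where every extra "alternative source" is a standby server someone has to provision and pay for.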
Finally, as mentioned above, there is a growing need to operate at the edge of the network, where resources are spread over a wide area and potentially over low-quality networks. If the experts are right, there will soon be 10 times the data volume at the edge as in the cloud (see Gartner). Managing workflows at the edge of the network will require a centrally managed P2P solution (like Resilio) utilizing a combination of technologies to solve all of the above.
Managing enterprise workflows reliably and at scale, in the cloud or at the edge, favors a P2P solution, and you already see it in various IoT, container orchestration, and edge computing architectures. Expect to see more.
If you talk about it in terms of corporate use / pirating fads / user experience etc., I don't know. At this point the success is not technical anymore and will depend on the network effect (pun not intended).
I'm aware that both Google Drive and Dropbox can create zip archives of whole folders, but who guarantees that the content hasn't been deleted or replaced? Integrity is more important than convenience.
Is there a self-hosted solution that does? Like IPFS or Dat?
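The integrity guarantee that BitTorrent, IPFS, and Dat all share is just this: the identifier of the content is (or contains) a cryptographic hash, so the downloader can verify the bytes themselves. A minimal sketch of that check, assuming you obtained the expected hash out of band (e.g. from the original torrent or link):

```python
import hashlib

def verify(content: bytes, expected_sha256: str) -> bool:
    """Recompute the hash and compare it with the value obtained out of
    band. A torrent's piece hashes play the same role for every piece."""
    return hashlib.sha256(content).hexdigest() == expected_sha256

original = b"the folder contents, as first published"
expected = hashlib.sha256(original).hexdigest()  # published with the link

print(verify(original, expected))                       # True
print(verify(b"silently replaced contents", expected))  # False
```

A zip generated on the fly by Drive or Dropbox has no such published hash, so a deletion or substitution is undetectable by the downloader.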
But I doubt anything will displace BitTorrent anytime soon. It's quite entrenched at this point.
The documentation is very light for now, though. There are some large organizations that use NFS or AFS in large networks; IPFS looks like a promising replacement, but I think it would need more documentation and stability before they would consider a switch.
At this tempo, I predict BitTorrent will almost fully decline within around a decade.
If you are asking about a successor for movie/TV-series piracy, then streaming is already taking BitTorrent's place.
As for apps and games, it's only a matter of time until they can't be pirated anymore. There is a big push to make games streaming-only, and secure enclaves (like on the new Intel i7s) will make software uncrackable.