I completely gave up torrents for Usenet, also using the -arr’s to get content for Plex. I completely saturate my bandwidth with Usenet downloads and I’ve never once received an ISP letter, and I’ve been entirely without a VPN.
Developer, 11 year reddit refugee
As someone who completely gave up torrenting for usenet, what made you decide against usenet?
I was using Vanced for around a year, and immediately switched to Revanced when it became available. No issues so far
To elaborate further from the other comment, it’s a person running a copy of the Lemmy software on their server. I for example am running mine (and seeing this thread) from https://zemmy.cc. Thanks to Federation all of our different servers are able to talk to each other so we can have a shared experience rather than everyone being on one centralized instance managed by one set of administrators (like reddit is).
This provides resilience to the network. If reddit goes down, reddit is down. If lemmy.world goes down, you can still access the content of every community that isn’t on lemmy.world, and if other servers were subscribed to the content on a community from lemmy.world you could still see the content from before the server went offline (and it will resync once it’s back up).
If we put all of our eggs into a single basket, we have a single point of failure. If all of the major communities go to lemmy.world then lemmy.world is that single point of failure. Doing that is effectively just recreating the same issues we had with reddit but with extra steps. By spreading larger communities across servers we ensure that the outage (or permanent closure) of a single instance doesn’t take down half the active communities with it.
My friend's instance, crystals.rest, is hosted on a $5/mo Linode with 1GB of RAM.
Putting all of the large communities on a single instance is just reddit with more steps. It’s good that one of the larger Lemmy communities is not also on the largest Lemmy instance. Lemmy.world suffers a lot of outages (in part because it’s so centralized), meanwhile this community remains available.
Open settings, go to search from the left hand menu, scroll down to the list of search shortcuts, and either permanently remove the ones you don't want or just uncheck the checkbox next to them so they won't show up in the address bar.
Also, that level of pixelation is easily reversed; better to just black out the parts you don't want visible.
You could always try Asahi Linux if you’re on a newer MacBook
This is a known bug in 0.18.3, a fix will be in the next release:
The infinite scroll feature is not yet implemented in the UI, it's only been added to the backend
Same experience in Argentina and Paraguay
I think it's a bit silly to have megathreads just because some users can't scroll past posts that don't interest them.
The problem is there are so goddamn many, to the extent that I'm working on a userscript that lets me entirely hide posts that contain keywords. Checking my frontpage using Subscribed/Active, 5 of the first 20 posts are about this "news". And that's a full day after it happened; yesterday was far worse.
Edit: The userscript is ready!
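The core of a keyword-hiding userscript like the one described is small. This is a minimal sketch, not the author's actual script: the `.post-title` / `.post-listing` selectors and the example keyword list are assumptions you'd adjust to the real Lemmy UI markup.

```javascript
// ==UserScript==
// @name     Hide posts by keyword
// @match    https://zemmy.cc/*
// ==/UserScript==

// Hypothetical keyword list; replace with the topics you want hidden.
const KEYWORDS = ["keyword1", "keyword2"];

// Case-insensitive check: does the title contain any blocked keyword?
function containsKeyword(title, keywords) {
  const lower = title.toLowerCase();
  return keywords.some((kw) => lower.includes(kw.toLowerCase()));
}

// Hide each post whose title matches. The CSS selectors here are
// assumptions about the Lemmy frontend's markup.
function hideMatchingPosts() {
  document.querySelectorAll(".post-title").forEach((el) => {
    if (containsKeyword(el.textContent, KEYWORDS)) {
      el.closest(".post-listing")?.style.setProperty("display", "none");
    }
  });
}

// Only touch the DOM when running in a browser context.
if (typeof document !== "undefined") {
  hideMatchingPosts();
}
```

A real version would also re-run `hideMatchingPosts` from a MutationObserver, since Lemmy renders posts dynamically as you scroll.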
Is the subjective experience the thing that defines what is the most palatable form of this?
If that's the case, then as someone else suggested, they could simply remove the memory of the experience up until right before you walk out the other end. For all you'd know it was incredibly excruciating, but you're none the wiser. Would the lack of that memory negate the experience?
Exactly, if you are not annihilated then that means two identical versions of an entity that thinks it’s you exist simultaneously, and now one of them has to be killed to maintain the illusion of this being transport rather than cloning.
Now I want to see a dystopian fiction where the original instances of a person are taken away and used as slave labor while the clones come out the other side thinking they’re the only copy.
Let's assume the machine works one of two ways. It either destroys the original as it's read into the machine and reconstructed on the other end, or it leaves the original intact and simply reads and copies simultaneously.
In the first case there are zero complete copies of you in existence as you're undergoing a phase of removing information from one place and reconstructing it in another, I'd call that death and cloning.
In the second case there are two identical copies of you in existence until they destroy the original, I’d call that a clone.
Quantum entanglement would mean that while it reads your initial state and encodes the new state there are two copies of you in existence, that is cloning, then the initial state dies. Unless the process of reading that state is destructive, then you just die and are cloned.
A method between the two you suggested would also mean you die momentarily and then are recreated. For the period of time it takes to encode your atoms into a method of transport and then reassemble them at your destination, you no longer exist in complete form.
This question all comes down to your opinion of what makes a person a person: whether we are something greater than the collection of our atoms, or whether we are simply the emergent outcome of the complex arrangement of atoms. If you subscribe to the former then you also need to believe that this machine is somehow capable of transporting/transplanting that "soul", for lack of a better expression. Whereas if you subscribe to the latter, then this is most certainly a suicide cloning machine.
I personally subscribe to the idea that consciousness is an emergent property of complexity. Given a sufficiently large series of inputs you can observe new and unexpected outputs that appear to be on higher orders of complexity than their inputs. This response is an example of that: from electrons flowing through transistors we end up with operating systems, hardware IO, web browsers, networking protocols, ASCII standards, font rendering, etc. All of that complexity emerges from a massive amount of on/off switches arranged in patterns over time.
Following this chain of reasoning I believe that making an exact duplicate of me down to the state of each atom is no different than that entity being me, however as a conscious being with human ethics and morals I put value in the singularity of my existence, and so a plurality of Zetaphor is something I find undesirable as it fundamentally challenges my perception of what it means to be myself.
So assuming the entity leaving the transporter is me, there are two ways to approach how a machine like this could operate:
That means one of two things, either there is a brief moment of time where two identical copies of me are in the universe, or there is a period of time where zero complete copies of me exist in the universe. So either I stopped existing momentarily and then was recreated from scratch (death and clone birth), or I existed in two places at once and then died in one (cloning and suicide).
If all I experience is being one place one moment and another place the next, then it’s me
If I make an exact molecular copy of you and set that copy free into the world thinking it had just successfully transported, but then I take the original you that entered the transporter and lock them up in a basement somewhere, how is that any different? From the perspective of the conscious being that came out the other end, their continuity is uninterrupted. They will think they are the only version of themselves to have ever existed and that they simply moved from one place to another, rather than being a duplicate of an original entity that may be dead or, in this case, locked in a basement.
I’m really enjoying Otterwiki. Everything is saved as markdown, attachments are next to the markdown files in a folder, and version control is integrated with a git repo. Everything lives in a directory and the application runs from a docker container.
It’s the perfect amount of simplicity and is really just a UI on top of fully portable standard tech.
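A setup like the one described can be sketched as a single `docker run`. This is a hedged example, not the official quickstart: the `redimp/otterwiki` image name, the internal port, and the `/app-data` mount path are assumptions to verify against the OtterWiki docs before use.

```shell
# Create a local directory to hold the wiki's markdown files,
# attachments, and the git repo that backs version control.
mkdir -p ./app-data

# Run OtterWiki in a container; everything it writes lands in ./app-data,
# so the whole wiki stays portable as plain files plus git history.
docker run -d --name otterwiki \
  -p 8080:80 \
  -v "$(pwd)/app-data:/app-data" \
  redimp/otterwiki
```

Because the data directory is just markdown in a git repo, you can also clone or back it up with ordinary git tooling, independent of the container.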