Self-hosting

2782 readers
13 users here now

Hosting your own services. Preferably at home and on low-power or shared hardware.

founded 2 years ago

Beneath the dark and uncertain clouds of big tech, hidden among the declassed byte workers and the false technological prophets who, with siren songs, offer their digital services to "facilitate" digital life, rises an anarchic and countercultural community that seeks to reclaim the Internet and fight those who squeeze our identities into data to generate wealth, advertising and mass social manipulation. Navigating the network of networks with a small fleet of self-managed servers, geographically distributed yet cohesively united across cyberspace, the self-hosting community emerges as a way of life, a logic of inhabiting the digital, a way of fighting for an open, human network, free from the oligarchy of data.

To the now crystallized phrase "the cloud is just someone else's computer" we add that this "someone else" is nothing more than a conglomerate of corporations that, like a hungry kraken, devours and controls the oceans of cyberspace. Against this we arm ourselves with community action, direct and self-managed, by and for those of us who inhabit and fight for a more sovereign and just Internet. Our objectives are clear and our principles precise. We seek to break the mirage and the spell that these beasts impose at the point of ISPs and blacklists, and we promote the ideal of a community organized around its own computing needs, without the intermediation of outlaws and byte smugglers.

The big tech companies disembarked on the net with a myriad of free services that came to replace standards established over years of work among users, developers, communities, technocrats and other enthusiasts of the sidereal tide of cyberspace. By commoditizing basic Internet services and transforming them into objects of consumption, they led us to their islands of stylized products, built entirely with the aim of commercializing every aspect of our lives in an attempt to digitize and direct our consumption. Sending an email, chatting with family and friends, saving files on the network or simply sharing a link: everything ends up duly indexed, tagged and processed by someone else's computer. An other that is not a friend, nor a family member, nor anyone we know, but a megacorporation that, on the basis of coldly calculated decisions, tries to manipulate and modify our habits and consumption. Anyone who has inhabited these digital spaces has seen how these services have changed our social behaviors and our perception of reality. Or will we keep turning a blind eye to the tremendous disruption that social networks cause in young people, and to the absurd waste of resources involved in sustaining the applications of the technological mega-corporations? Perhaps those who so praise the Silicon Valley technogurus do not see the disaster of having to replace your phone or computer because you can no longer browse the web or send an email.

If this is the technosolutionism that crypto-enthusiasts, evangelists of the web of the future or false shamans of programming offer us, we reject it out of hand. We are hacktivists and grassroots free software activists: we appropriate technology in pursuit of a collective construction that answers to our communities and not to the spurious designs of a hypercommercialized IT market. If today's byte worker plays the same role as the charcoal burner or workshop hand at the end of the 19th century, it is imperative that they politicize themselves and appropriate the means of production to build an alternative to this data violence. Only when this huge mass of computer workers awakens from its lethargy will we be able to take the next step towards the refoundation of cyberspace.

But we do not have to build on an empty ocean, as if we were lost at sea far from any coast; there is already a small but solid fleet of nomadic islands that dodge and cut off the tentacles of the big tech kraken. Those islands are other people's computers, but real others: self-managed and organized in pursuit of personal, community and social needs. Self-hosting consists of materializing what is known as "the cloud", but stripped of the tyranny of data and the waste of energy to which the big tech companies have accustomed us. These islands are not organized to commoditize our identities, but to provide email, chat, file hosting, voice chat or any other digital need. Our small server-islands demonstrate that it is possible to stay active on the network without the violent tracking and theft, and without the imposed need to constantly replace our equipment: because self-hosted services are designed by and for the community, they are built for the highest possible efficiency rather than the immoral waste that feeds the climate crisis.

For this reason we say to you, declassed byte workers: train yourselves, question yourselves, and appropriate the tools you use in order to form a commonwealth of hacktivists! Only through the union of computer workers with the self-hosting and hacktivist communities will we be able to build alternatives for the refoundation of a cyberspace at the service of the people and not of the byte oligarchy.

But we need not only the working masses but also ordinary digital citizens: let's wake up from the generalized apathy to which we have grown accustomed! No one can say anymore that technology is not their thing or that computing does not matter to them when all of our lives are mediated by digital systems. That Android phone that still works but no longer lets you check your email or chat with your family is simply technological reality hitting you in the face, as much as the anxiety and distraction that have been cultivated in you over the last 15 years. Imagine the brain of a 14-year-old teenager, totally moth-eaten by the violent algorithms of big tech!

Community digital needs are settled on the shores of our server-islands, not on the flagships of the data refineries. Let's unite by building small servers in our homes, workplaces and cultural spaces; let's unite by building data networks that provide federated, public instant messaging services that truly respect our freedoms and privacy. Let's publish robust, low-latency voice services; let's encourage services with low computational demands, so every voice counts whether you sail a rowboat or a state-of-the-art racing yacht. Let's create specialized forums and interconnect communities to unite us all; let's set our sails with the protocols and standards that already exist, which allow us to navigate the network with the device we want and not the one imposed on us. Let's lose the fear that keeps us from taking the first step and begin this great learning path, which as an extra benefit will let us regain not only our technological sovereignty but also control of our digital essence. It is not a matter of cutting off big tech's private data networks, but of building self-managed, self-hosted and self-administered spaces from the hacktivist grassroots, together with the workers of the byte and the digital citizenry: an Internet of the community, for the community.


Consider watching this video with FreeTube, a nifty open-source program that lets you watch YouTube videos without Google spying on your viewing habits!

Combined with LibRedirect, which automatically opens YouTube links in FreeTube, it becomes really slick and effortless to use.


Long-awaited release, with the flagship feature of custom attributes! This unlocks many integrations, notably Linux user management through PAM.


The founder of Drupal posted recently about this self-hosted and completely solar-powered personal site he made, in Boston of all places.

He describes the hardware, software, and the challenges he ran into while setting it all up. The site even includes automatically updating statistics about the system and battery. There's no backup or failover, so if the battery drains due to cloudy or cold weather, the website simply goes offline for a while, and he's fine with that.


cross-posted from: https://lemmy.world/post/20066526

Features:

  • Distroless

  • -THE- smallest nextdns docker image there is

  • With RISC-V support

  • Both Dockerfile and docker-compose provided @ op link

Enjoy.


Hey guys!

I want to convert my now-corebooted ThinkPad T430 into a Nextcloud server and possibly more (Syncthing, maybe Tor, ...).

It has one 500 GB SSD and one 1 TB SSD.

It currently runs Fedora Kinoite; I could rebase to something like secureblue uCore, Fedora IoT, uBlue uCore, ...

Not sure if those would have broken configs though.

Maybe I would prefer something with a slower pace, but tbh the pace at which CentOS bootc is becoming a thing is quite frustrating. It would likely be the perfect 'install and forget' distro for many, and a KDE image would show up in no time.

I wouldn't want to use a traditional distro, even though a base Debian or AlmaLinux/Rocky Linux (what the hell was that hydra thing? Cut off one head, two spawn? What are the differences??) would probably be fine. I used Debian in the past; it really just works.

I would like

  • the Nextcloud AIO Docker image, maybe with Podman? It is supposedly more secure, but the world runs on Docker and that's fine; Podman is quite often a pain. (A rough sketch of the usual AIO run command is below this list.)
  • some nice management UI like Cockpit
  • dynamic DNS, for example with No-IP, ideally free
  • secure SSH, which should be no issue
  • Btrfs or ZFS, with backups to a secondary drive
  • automatic updates with snapshot creation; an atomic system would be easiest here
  • an easy-to-use and secure reverse proxy, with dynamic DNS for a reliable address on the internet. NGINX, Traefik, Caddy: which is best here??
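For reference, the Nextcloud AIO README documents a single run command for the master container, roughly like the sketch below. This is from memory, so double-check it upstream before using it, especially if AIO ends up behind your own reverse proxy, where the published ports differ.

# Nextcloud AIO master container, roughly as in the nextcloud/all-in-one README (verify before use)
sudo docker run \
  --init \
  --sig-proxy=false \
  --name nextcloud-aio-mastercontainer \
  --restart always \
  --publish 80:80 \
  --publish 8080:8080 \
  --publish 8443:8443 \
  --volume nextcloud_aio_mastercontainer:/mnt/docker-aio-config \
  --volume /var/run/docker.sock:/var/run/docker.sock:ro \
  nextcloud/all-in-one:latest

Podman apparently works too, but the AIO master container manages the other containers through a Docker-compatible socket, so expect some extra fiddling on that route.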

Here I am not sure if I should use 1 TB + 1 TB, or 500 GB in use and 1 TB as backup. BTRFS backups can be incremental.

While I made a list of BTRFS tools, I still have no idea what the best tool for this job is.
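For what it's worth, the bare-bones approach without extra tools is read-only snapshots plus incremental send/receive to the second drive, something like the sketch below (the paths are just examples, and it assumes both the data subvolume and the backup drive are Btrfs):

# initial full copy: take a read-only snapshot, then send it to the backup drive
sudo btrfs subvolume snapshot -r /data /data/.snapshots/data-2025-01-01
sudo btrfs send /data/.snapshots/data-2025-01-01 | sudo btrfs receive /mnt/backup/snapshots

# later runs: take a new snapshot and send only the difference against the previous one
sudo btrfs subvolume snapshot -r /data /data/.snapshots/data-2025-01-08
sudo btrfs send -p /data/.snapshots/data-2025-01-01 /data/.snapshots/data-2025-01-08 \
  | sudo btrfs receive /mnt/backup/snapshots

From what I've gathered, tools like btrbk automate exactly this snapshot-and-send rotation.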


cross-posted from: https://lemmy.pierre-couy.fr/post/653426

This is a guide I wrote for Immich's documentation. It features some Immich-specific parts, but should be quite easy to adapt to other use cases.

It is also possible (and not technically hard) to self-host a protomaps release, but this would require 100GB+ of disk space (which I can't spare right now). The main advantages of this guide over hosting a full tile server are:

  • it's a single nginx config file to deploy
  • it saves you some storage space since you're only hosting tiles you've previously viewed. You can also tweak the maximum cache size to your needs
  • it is easy to configure a trade-off between map freshness and privacy by tweaking the cache expiration delay

If you try to follow it, please send me some feedback on the content and the wording, so I can improve it.


cross-posted from: https://programming.dev/post/18360806

Hi everyone,

I would like to enable Cross-Origin Resource Sharing (CORS) on my Nginx server for a few origins/requesting domains.

I've found this article https://www.juannicolas.eu/how-to-set-up-nginx-cors-multiple-origins which is nice, but not complete, and in my browser it's really hard to read due to the layout 🤮

So I've opened a Codeberg git repository for any good soul who wants to perfect this piece of code, to allow as many of us as possible to use CORS with Nginx.

https://codeberg.org/R1ckSanchez_C137/BestOfxxx/src/branch/main/Nginx/CORS_MultiDomains.py

If you don't want to create an account on Codeberg, feel free to post your code here!

# Note: the map directive must live at the http {} level, i.e. outside any server {} block.
# It works like a switch/case: $cors is set to the request's Origin header when it matches
# one of the allowed domains, and to an empty string otherwise.
# The leading ~ marks the pattern as a regex. The Origin header only ever contains
# scheme://host[:port], never a path or query string; add (:\d+)? before the $ if you
# also need to match non-default ports.
map $http_origin $cors {
    default '';
    "~^https://([\w.-]+\.)?example\.com$"  "$http_origin";  # example.com and any subdomain
    "~^https://([\w.-]+\.)?example2\.com$" "$http_origin";  # example2.com and any subdomain
}

server {
    # listen, server_name, etc.

    location /static {

        # Answer CORS preflight requests directly and let the browser cache the result
        if ($request_method = 'OPTIONS') {
            add_header 'Access-Control-Allow-Origin' "$cors" always;
            add_header 'Access-Control-Allow-Credentials' 'true' always;
            add_header 'Access-Control-Allow-Methods' 'GET, POST, PUT, DELETE, OPTIONS' always;
            add_header 'Access-Control-Allow-Headers' 'Accept, Authorization, Cache-Control, Content-Type, DNT, If-Modified-Since, Keep-Alive, Origin, User-Agent, X-Requested-With' always;
            add_header 'Access-Control-Max-Age' 1728000;  # 20 days
            add_header 'Content-Type' 'text/plain; charset=UTF-8';
            add_header 'Content-Length' 0;
            return 204;  # https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/204
        }

        # Add the CORS headers to normal responses when the Origin is allowed
        if ($cors != "") {
            add_header 'Access-Control-Allow-Origin' "$cors" always;
            add_header 'Access-Control-Allow-Credentials' 'true' always;
            add_header 'Access-Control-Allow-Methods' 'GET, POST, PUT, DELETE, OPTIONS' always;
            add_header 'Access-Control-Allow-Headers' 'Accept, Authorization, Cache-Control, Content-Type, DNT, If-Modified-Since, Keep-Alive, Origin, User-Agent, X-Requested-With' always;
        }

        # configuration lines...
    }
}

does anyone here have experience hosting a Signal proxy and/or a Tor relay? there's a blog post on signal.org asking for folks to help, and i can but i don't know enough about network security to feel safe/confident doing some of this stuff. same with Tor - i've always wanted to host an exit relay (and in fact have this whole long theory about how every public library in the US should host an exit relay, but that's for another post someday maybe).

do any of you have experience with doing this? what kind of best practices would you recommend? any good resources on protecting your network that you might point me to? i will be getting my Net+ cert within the next year but for now i am starting from "enthusiastic beginner" and want to be helpful, but careful.
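for what it's worth, from what i remember of Signal's instructions the proxy part itself is only a few commands on a machine with a domain pointed at it and ports 80/443 free (this is from memory, so double-check the official repo before running anything):

# Signal TLS proxy, following the steps in Signal's published instructions
git clone https://github.com/signalapp/Signal-TLS-Proxy.git
cd Signal-TLS-Proxy
sudo ./init-certificate.sh         # requests a Let's Encrypt certificate for your domain
sudo docker compose up --detach    # or docker-compose on older installs

the Tor side is a different beast: the Tor Project's own relay guide covers hardening, and exit relays in particular are generally discouraged on home connections, so a middle relay or bridge is the usual starting point.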


This may be deemed slightly off topic but I felt like this community might know the answer to this. I'm looking for a way to permanently embed information about who is in a photo, but when I search Google I just get some forum posts from 10 years ago. Surely there is something more recent? How would you go about doing this? Let's assume they are JPG.

I thought about this when looking through photos from my grandparents, where the names are written on the back of the photo. I have many digital photos from ten years ago and I've already forgotten the names of some of the people so imagine what it will be like in another 30 years.
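One approach that seems to hold up is writing the names into the file's own embedded metadata with exiftool, for example the IPTC Extension PersonInImage tag plus ordinary keywords. The names and filename below are just placeholders:

# store the people's names in the image's embedded XMP metadata
exiftool -XMP-iptcExt:PersonInImage+="Jane Doe" -XMP-iptcExt:PersonInImage+="John Doe" photo.jpg

# also add them as plain keywords so most gallery/indexing software can find them
exiftool -keywords+="Jane Doe" -keywords+="John Doe" photo.jpg

# verify what was written (exiftool keeps a photo.jpg_original backup by default)
exiftool -PersonInImage -Keywords photo.jpg

Since the names live inside the JPG itself, they survive copying, backups and re-uploads, unlike sidecar files or a photo manager's database.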


Obligator is a relatively simple and opinionated OpenID Connect (OIDC) Provider (OP) server designed for self-hosters.


GoToSocial is an ActivityPub social network server, written in Golang.


Basically a modernised PC Engines APU4, which sadly got discontinued.

submitted 5 months ago* (last edited 5 months ago) by hellfire103@lemmy.ca to c/selfhosting@slrpnk.net

I'm looking to buy a router for home use, on which I plan to install OpenWRT. After some research, I have come across the TP-LINK Archer AX23, which checks all of the boxes I have:

  • [x] Comparatively low price
  • [x] Supports WPA3
  • [x] Supported by OpenWRT
  • [x] Has at least three LAN ports

However, before my dad and I go and buy one, it has to pass the final test: the forums.

Has anyone used this router before? What was your experience? Can I do better, or have I found the best router ever made? Please share your thoughts.
