Sounds like a regulatory solution is needed. The intersection where domestic policy impacts international.
This.
Thanks to Meta, Btrfs has apparently gotten (or is getting) it to a certain extent too: https://youtu.be/6YIc2fVLVPU?si=ngiHWS0fw2zIHf2M
Back before the media decided it wasn’t a competitor but rather a potential profit source. I do think the government needs to have its own alternatives (obviously not identical; more on this one day) for other reasons, such as for its own media releases, but more internationally coordinated, appropriate, and considered legislation is probably better.
I asked ChatGPT to write a Go program for this, and it looks roughly correct (I have used both libraries before), though obviously it won’t be enough for your particular use case as-is. I imagine you can integrate an RSS feed into your site; if you’re using something like Hugo, perhaps output it as a CSV instead.
Super low effort, but a good start I think:
package main

import (
    "fmt"
    "log"
    "os"
    "time"

    git "github.com/go-git/go-git/v5"
    "github.com/go-git/go-git/v5/plumbing"
    // Note: generation uses gorilla/feeds; the original jteeuwen/go-pkg-rss
    // is a feed *parser*, not a generator.
    feeds "github.com/gorilla/feeds"
)

// Repository represents a git repository with its URL
type Repository struct {
    URL string
}

// Repositories is the list of git repositories to poll
var Repositories = []Repository{
    {URL: "https://github.com/owner/repo1"},
    {URL: "https://github.com/owner/repo2"},
    // Add more repositories here
}

// FetchLatestTag clones a repository into a temporary directory and returns
// the tag whose commit has the most recent committer date
func FetchLatestTag(repoURL string) (string, string, error) {
    dir, err := os.MkdirTemp("", "repo")
    if err != nil {
        return "", "", err
    }
    defer os.RemoveAll(dir)

    // A bare clone is enough; we only need refs and commit metadata.
    repo, err := git.PlainClone(dir, true, &git.CloneOptions{
        URL:      repoURL,
        Progress: os.Stderr, // keep clone progress out of the feed on stdout
    })
    if err != nil {
        return "", "", err
    }

    tags, err := repo.Tags()
    if err != nil {
        return "", "", err
    }

    var latestTag string
    var latestCommitTime time.Time
    err = tags.ForEach(func(ref *plumbing.Reference) error {
        commit, err := repo.CommitObject(ref.Hash())
        if err != nil {
            // Annotated tags point at tag objects, not commits; skip them
            // here (or resolve via repo.TagObject if you need them).
            return nil
        }
        if commit.Committer.When.After(latestCommitTime) {
            latestCommitTime = commit.Committer.When
            latestTag = ref.Name().Short()
        }
        return nil
    })
    if err != nil {
        return "", "", err
    }
    return latestTag, latestCommitTime.Format(time.RFC1123Z), nil
}

// GenerateRSS builds an RSS feed from the latest tag of each repository
func GenerateRSS() string {
    feed := feeds.Feed{
        Title:       "Latest Tags from Git Repositories",
        Link:        &feeds.Link{Href: "http://example.com/"},
        Description: "This feed provides the latest tags from a list of git repositories.",
        Created:     time.Now(),
    }
    for _, repo := range Repositories {
        tag, date, err := FetchLatestTag(repo.URL)
        if err != nil {
            log.Printf("Error fetching latest tag for repository %s: %v", repo.URL, err)
            continue
        }
        feed.Items = append(feed.Items, &feeds.Item{
            Title:       fmt.Sprintf("Latest tag for %s: %s", repo.URL, tag),
            Link:        &feeds.Link{Href: repo.URL},
            Description: fmt.Sprintf("The latest tag for repository %s is %s, created on %s.", repo.URL, tag, date),
            Created:     time.Now(),
        })
    }
    // ToRss emits the XML header itself, so no manual xml.Header is needed.
    rssXML, err := feed.ToRss()
    if err != nil {
        log.Fatalf("Error generating RSS feed: %v", err)
    }
    return rssXML
}

func main() {
    fmt.Println(GenerateRSS())
}
I like the idea and have been meaning to build or find something like this; however, this does a little too much, and not quite in the way I want. But it’s cool for those who need this exact implementation.
Gentoo, after a 15 year break where I used Ubuntu / Arch. Might try NixOS or something similar.
KDE for desktop env.
Thanks. I didn’t know, it is also on my list.
That part was understood. I don’t think I could complete 1 game in that period of time.
Burn read-only backups.
Not sure 12 hours is enough time for me to grab much. Perhaps my backlog.
What man pages are for
I’m contemplating trying to run the Meta bridge locally to get around that issue; it has to do with their server running in, I think, Finland?
At some point they said that after beta it would be $9 a month. But that messaging seems to have disappeared.
I feel like I have been doing this all my life. I think it’s also to do with the depth of understanding. But the environment has to support it: if there is an expectation that everyone is an expert from day one, and there is no room for self-improvement, then it can’t be done.
As stated, there are downsides to the approach, such as lack of exposure to new ideas. You still need to look, just not study. But to me it’s also a work/life balance policy. Don’t practice it to an extreme, though, as it can hold you back. Good workplaces should allow for some learning time, and I’m hoping that gets normalized.
Forgot /dev/hdx
?
The busybox one seems great as it comes with shells. php looks like it would add some issues.
Personally, since I use Go, I would create a Go app with embedded assets, which I would package as a deb, an rpm, and a Docker image using goreleaser.
package main

import (
    "embed"
    "log"
    "net/http"
)

//go:embed static/*
var content embed.FS

func main() {
    // Serve static/index.html as the default page
    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        data, err := content.ReadFile("static/index.html")
        if err != nil {
            http.Error(w, "not found", http.StatusNotFound)
            return
        }
        w.Header().Set("Content-Type", "text/html; charset=utf-8")
        w.Write(data)
    })

    // Serve the embedded static files. No StripPrefix needed: the embedded
    // paths already begin with "static/", matching the URL path.
    http.Handle("/static/", http.FileServer(http.FS(content)))

    // Start the server
    log.Fatal(http.ListenAndServe(":8080", nil))
}
That would be all the code, but it allows for expansion later. However, the image goreleaser builds doesn’t come with busybox on it, so you can’t docker exec into it. https://goreleaser.com/customization/docker/
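A minimal .goreleaser.yaml along those lines might look like this (a sketch: section names are per the goreleaser docs, while the image name and owner are placeholders):

```yaml
# Build a static linux/amd64 binary from the repo root.
builds:
  - main: .
    goos: [linux]
    goarch: [amd64]

# Package the binary as deb and rpm.
nfpms:
  - formats: [deb, rpm]

# Build a container image from the binary.
dockers:
  - image_templates:
      - "ghcr.io/owner/app:{{ .Version }}"
```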
Most of the other options, including the PHP one, seem to include a scripting language or a bunch of other system tools, etc. I think that’s overkill.
Anyone else here concerned about what this means for the health of the ecosystem? If Reddit was never sustainable and we are well and truly past a phase of consolidation, there is potentially a lot of history/info to lose here. The damage has already been done by the funding model. While the return to federation and private hosting is nice, there is a potential “dark age” ahead.
I feel Discord does really well because the way it structures its “servers” really centers on individuals rather than groups, which creates an incentive for a certain type of person to “grow their server,” bringing more activity onto Discord. This is compounded by both a) joining all channels on a server by default, and b) the ability of individuals to “mute” servers or channels; combined, it means servers fill up with a bunch of idlers in a way that’s worse than IRC, as it’s unlikely they will ever read the contents or participate beyond asking a question and then leaving.
Limited time to build something, so you have to pick based on a couple of factors, often the largest percentage of users.
The fact that it is from the LA Times shows that it’s still significant, though.