• 0 Posts
  • 38 Comments
Joined 3 years ago
Cake day: January 17th, 2022




  • the world runs off GitHub whether we like it or not

    It doesn’t and we don’t like it anyway.

    PS: to clarify, yes, GitHub is wildly popular, but it does not host ALL projects (the kernel is a particularly interesting example), only a lot of popular ones. A lot of very popular ones are also NOT there but rather on their own Git hosting, mailing lists, GitLab instances, Gitea, etc. It’s a shortcut, I understand that, but asserting it as “truth” hides a reality that is quite different and in which reliable alternatives do exist.





  • main difference between raster graphics and vector graphics was the quality

    It’s not. The primitives, the most basic constitutive building blocks, are different: for raster it’s the pixel (a mix of colors, e.g. red/green/blue), whereas for vector it’s the… vector (relatively positioned elements, e.g. a line, circle, rectangle, or a block of text).

    This is a fundamental distinction in how you interact with the content. For raster you basically paint over pixels, changing their values, whereas for vector you change the values of elements and add/remove elements. Both can be lossless though (vector always is, while raster can use no compression or lossless compression). That being said, raster does have a grid size (i.e. how many pixels are stored, e.g. 800x600) whereas vector does not, letting you zoom in infinitely and see no aliasing on straight lines.
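
    To make that concrete, here is a tiny hand-written SVG; the shapes and values are made up for illustration. Each element is described by coordinates and attributes rather than by a grid of pixels, which is why it stays crisp at any zoom level.

    ```xml
    <!-- three vector primitives, described by coordinates rather than pixels -->
    <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 200 100">
      <rect x="10" y="10" width="60" height="40" fill="steelblue"/>
      <circle cx="120" cy="30" r="20" fill="tomato"/>
      <line x1="10" y1="80" x2="190" y2="80" stroke="black" stroke-width="2"/>
    </svg>
    ```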

    Anyway, yes, it’s fascinating. In fact you can even modify SVG straight from the browser, no image editor or text editor needed, thanks to the browser inspector (easy to change the color of a rectangle, for example) or even the console: via JavaScript and contentDocument you can change a lot more programmatically (e.g. the color of all rectangles).
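
    A minimal sketch of that console trick, assuming the page embeds the SVG through an <object> tag; the id, file name, and color are placeholders, not taken from any real page:

    ```js
    // Run in the browser console. Assumes markup like:
    //   <object id="logo" type="image/svg+xml" data="logo.svg"></object>
    // (the id and file name are made up for this example).
    const svgDoc = document.querySelector('object#logo').contentDocument;

    // The embedded SVG has its own DOM, so the usual DOM APIs work on it:
    // recolor every rectangle in the image in one go.
    svgDoc.querySelectorAll('rect').forEach(rect => {
      rect.setAttribute('fill', 'tomato');
    });
    ```

    For an inline <svg> element you don’t even need contentDocument: document.querySelectorAll('svg rect') works directly on the page.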

    It’s a lot of fun to tinker with!




  • never could get away from Windows entirely. Especially for gaming, and a few critical apps.

    Been gaming exclusively on Linux for a few years now, including in VR. Just a few hours ago, before my work day, I was playing Elden Ring with a controller. Zero tinkering: System key, “EL” [ENTER], then play. So… unless you need kernel-level anti-cheat, Linux is pretty good for gaming nowadays.

    Same for the few “critical” apps: I don’t know what those are, but rare are the ones without an equivalent and/or that don’t work with Wine, sometimes even better than on Windows.

    Anyway: Debian. Plain and simple, no BS with a mixed bag of installers (though you can still use AppImage or am or even nix whenever you want to). It just works and keeps on working.




  • The propaganda aspect is important so I’m adding this as a reply rather than yet another edit.

    This research is interesting. But what the article tries to do isn’t to clarify the work, it’s to put a nation “first”. Other nations do that too, and it’s not a good thing. We should celebrate research as a better understanding of our world, both natural and engineered. We should share what has been learned and build on top of each other’s work.

    Now, when a nation, be it China, the US, or any other country, says it is “first” and “ahead” of everybody else, it’s to bolster nationalistic pride, not to educate citizens on the topic. It’s important to be able to disentangle the two regardless of the source.

    That’s WHY I’m being so finicky about the facts here. It’s not that I care about the topic particularly; rather, it’s about the overall political process, not the science.


  • Thanks for taking the time to clarify all that.

    It’s not a typo, because the paper itself does mention the 3090 as a benchmark.

    I do tinker with FPGAs at home, for the fun of it (I’m no expert, but the fact that I own a few already shows that I know more about the topic than most people, who don’t even know what an FPGA is or what it’s for), so I’m quite aware of what some of the benefits (and trade-offs) can be. It’s an interesting research path (again, otherwise I wouldn’t have invested my own resources to learn about that architecture in the first place), so I’m not criticizing that either.

    What I’m calling BS on… is the title and the “popularization” (and propaganda, let’s be honest here) article. Qualifying a 5-year-old chip as a flagship (when, again, it never was) and implying what the title does is wrong. It overblows otherwise interesting work. That being said, I’m not surprised; OP shares this kind of thing regularly, to the point that I ended up blocking him.

    Edit: not sure if I really have to say so, but the 4090, in March 2025, is NOT the NVIDIA flagship; it’s one generation behind. I’m not arguing about the quality of NVIDIA or AMD or any other chip here. I’m again only trying to highlight how the article is sensationalized to make the title look more impressive.

    Edit 2: the 5090, in March 2025 again, is NOT even the flagship in this context anyway. That’s only for gamers… but the article, again, is talking about “energy-efficient AI systems”, and for that NVIDIA has an entire array of products, from Jetson to GB200. So… sure, the 3090 isn’t a “bad” card for a benchmark, but in that context it is no flagship.

    PS: taking this occasion to say that I do wish OP would actually go to China, work and live there. If that’s their true belief and they can do so, they shouldn’t just “admire” a political system from the outside, from the perspective of someone not participating in it, but rather give up their citizenship and actually move to China.