Nice, how does it compare to Kdenlive?
oculus software for my vr
Check https://lvra.gitlab.io/ for plenty of options. I’m playing VR on Linux myself, using SteamVR with the Index.
the world runs off GitHub whether we like it or not
It doesn’t and we don’t like it anyway.
PS: to clarify, yes GitHub is wildly popular but, and the kernel is a particularly interesting example, it does not host ALL projects, only a lot of popular ones. A lot of very popular ones are also NOT there but rather on their own Git hosting, mailing lists, GitLab instance, Gitea, etc. It’s a shortcut, I understand that, but asserting it as “truth” hides a reality that is quite different, namely that reliable alternatives do exist.
It’s federated, so one can set up whatever instance they want on whatever domain they want.
If the admin feels “.wtf” is edgy, cool. If someone else believes it’s NSFW or wouldn’t help promote the cause, they can set up another instance on another domain. If the content itself is federated, they might share that link instead.
True, in fact I’ve done so myself (simplifying a curve resulting from hand sketching). Still, I’d argue that’s not the expected behavior of storing the vector file, but rather explicitly modifying it.
main difference between raster graphics and vector graphics was the quality
It’s not. The primitives, the most basic constitutive building blocks, are different: for raster it’s the pixel (a mix of colors, e.g. red/green/blue) whereas for vector it’s the… vector (relatively positioned elements, e.g. a line, circle, rectangle, or text, to start with).
This is a fundamental distinction in how you interact with the content. For raster you basically paint over pixels, changing their values, whereas for vector you change the values of elements and add/remove elements. Both can be lossless though (vector always is; raster can use no compression or lossless compression). That being said, raster does have a grid size (i.e. how many pixels are stored, e.g. 800x600) whereas vector does not, letting you zoom infinitely and see no aliasing on straight lines.
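To make that concrete, here is a minimal (hypothetical) SVG, the usual vector format on the Web. The file literally is just a list of such elements and their properties, no pixels stored anywhere:

    <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 300 150">
      <!-- each primitive is positions and properties, not a grid of pixels -->
      <rect x="10" y="10" width="100" height="50" fill="blue"/>
      <circle cx="200" cy="75" r="40" fill="red"/>
    </svg>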
Anyway yes it’s fascinating. In fact you can even modify SVG straight from the browser, no image editor or text editor needed, thanks to your browser inspector (easy to change the color of a rectangle for example), or even from the console itself: via JavaScript and contentDocument you can change a lot more programmatically (e.g. change the color of all rectangles).
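For example, a minimal sketch from the browser console (assuming, hypothetically, that the SVG is embedded in the page via an object tag with id "drawing", and served from the same origin, otherwise contentDocument is blocked):

    // grab the embedded SVG's own document (same-origin only)
    const svg = document.getElementById("drawing").contentDocument;
    // recolor every rectangle in the drawing in one go
    svg.querySelectorAll("rect").forEach(r => r.setAttribute("fill", "red"));

For an inline <svg> element you can skip contentDocument entirely and query the page directly.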
It’s a lot of fun to tinker with!
Meanwhile https://www.europarl.europa.eu/petitions/en/petition/content/0729%252F2024/html/Linux%2Bstatt%2BWindows just closed with 2474 supporters
I also have a Steam Deck and it’s IMHO one of the best devices to promote Linux. Just hand skeptics the device, let them play, then ask them how the experience was and whether they can guess the OS.
never could get away from Windows entirely. Especially for gaming, and a few critical apps.
Been gaming exclusively on Linux for a few years now, including in VR. Just a few hours ago, before my work day, I was playing Elden Ring with a controller. Zero tinkering: System key, “EL” [ENTER], then play. So… unless you need kernel-level anti-cheat, Linux is pretty good for gaming nowadays.
Same for the few “critical” apps: I don’t know what these are, but rare are the ones without an equivalent and/or that don’t work with Wine, sometimes even better than on Windows.
Anyway: Debian. Plain and simple, no BS with a mixed bag of installers (but you can still use AppImage or am, or even nix, whenever you want to). It just works and keeps on working.
Another Debian suggestion here, including for gaming and even VR. It basically just works.
Looks like https://old.reddit.com/r/kde/comments/d3m0fz/how_to_open_links_in_mpv_with_klipper/ is a good starting point, i.e.
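something along these lines under Klipper settings, Actions, Add Action (my untested reading of that thread; the example.com regexp is a hypothetical placeholder, match your actual links):

    Regexp:  ^https://example\.com/.*
    Command: mpv %s

where, IIRC, %s gets replaced by the matched clipboard content,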
then… to try! :D I’m just discovering this too but it seems like the right way.
That said I’d be cautious and limit the use case to only what you have, e.g. Spotify links, at least at first because I imagine one can get into hairy edge cases quickly.
Keep us posted!
The propaganda aspect is important so I’m adding this to a reply rather than yet another edit.
This research is interesting. What the article tries to do isn’t clarify the work but rather put a nation “first”. Other nations do that too. That’s not a good thing. We should celebrate research as a better understanding of our world, both natural and engineered. We should share what has been learned and build on top of each other’s work.
Now when a nation, be it China, or the US, or any other country, says it is “first” and “ahead” of everybody else, it’s to bolster nationalistic pride. It’s not to educate citizens on the topic. It’s important to be able to disentangle the two regardless of the source.
That’s WHY I’m being so finicky about facts in here. It’s not that I care about the topic particularly, rather it’s about the overall political process, not the science.
Thanks for taking the time to clarify all that.
It’s not a typo, because the paper itself does mention the 3090 as a benchmark.
I do tinker with FPGAs at home, for the fun of it (I’m no expert but the fact that I own a few already shows that I know more about the topic than most people, who don’t even know what it is or what it’s for), so I’m quite aware of what some of the benefits (and trade-offs) can be. It’s an interesting research path (again, otherwise I wouldn’t have invested my own resources to learn more about that architecture in the first place) so I’m not criticizing that either.
What I’m calling BS on… is the title and the “popularization” (and propaganda, let’s be honest here) article. Qualifying a 5-year-old chip as a flagship (when, again, it never was) and implying what the title does, is wrong. It’s overblowing otherwise interesting work. That being said, I’m not surprised: OP shares this kind of thing regularly, to the point that I ended up blocking him.
Edit: not sure if I really have to say so but the 4090, in March 2025, is NOT the NVIDIA flagship, that’s 1 generation behind. I’m not arguing for the quality of NVIDIA or AMD or whatever chip here. I’m again only trying to highlight the sensationalization of the article to make the title look more impressive.
Edit2: the 5090, in March 2025 again, is NOT even the flagship in this context anyway. That’s only for gamers… but here the article, again, is talking about “energy-efficient AI systems” and for that, NVIDIA has an entire array of products, from Jetson to GB200. So… sure the 3090 isn’t a “bad” card for a benchmark but in that context, it is no flagship.
PS: taking the occasion to highlight that I do wish OP would actually go to China, work and live there. If that’s their true belief and they can do so, rather than solely “admiring” a political system from the outside, without participating in it, they should give up their citizenship and actually move to China.
Well, I honestly tried (cf history). You’re neither addressing my remark about the fact from the article nor the bigger picture. Waste of time, blocked.
Unfortunately my model isn’t supported. I might look for a 2nd hand supported one with the USB adapter and try, as I do use and work with Linux on a daily basis.
Based on https://old.reddit.com/r/mildlyinfuriating/comments/1jb2uvt/roomba_accidentally_saw_outside_and_now_i_cant/ I’d bet some models surely do.
That being said, I am NOT promoting Roomba or any other brand, I’m only highlighting that apps aren’t necessarily a requirement for the basic feature.
Finally, as others suggested, if one genuinely does need such a feature and is mindful about privacy, I’d check https://valetudo.cloud/ first, then see what hardware supports it. Sadly it doesn’t seem to support Roomba AFAICT, but it does support Roborock, lucky you: check https://valetudo.cloud/pages/general/supported-robots.html#roborock
Edit: apparently “Xiaomi V1 is made by Roborock” according to https://valetudo.cloud/pages/general/supported-robots.html so maybe there is a way, worth investigating for you IMHO.
turns out you can use older GPUs in creative ways to get a lot more out of them than people realized
If that’s the point, then that’s the entire GPUs-used-for-mining-then-ML revolution, mostly thanks to CUDA, which already happened around 2010, so that’s even older: that’d be 15 years ago.
What I was highlighting anyway is that it’s hard to trust an article where simple facts are wrong.
What is this… “Nvidia’s flagship RTX 3090 GPU”? Are we back in 2020? Half a decade ago? Is this a joke? Even then, it wasn’t the flagship; the 3090 Ti was.
Is there a Murena/Volla of vacuuming robots? Namely, can one buy a working robot (new or reconditioned) with Valetudo pre-installed?
Angry update.