Meta's vision was worse than that. They were trying to hype doing work meetings in VR. There's a case to be made that VR games and VR universes can be fun... But work meetings?
> There's a case to be made that VR games and VR universes can be fun... But work meetings?
If it's actual holograms like in Star Wars? Sure, why not. Get the visual and body language cues of the rest of the room but no one has to physically congregate at a location.
I mean... yeah. That's why no social network like 2007-era Facebook has appeared since. It would have to be run by someone with a lot of built-up trust in order for people to trust it the way people did with Facebook. The world has been burnt.
If say, Valve started a social network I would consider using it because they've had decades to screw up Steam and they haven't. It is a bit outside their wheelhouse though.
Yeah that's one of those quotes that has done more harm than good. It's roughly equivalent to the fallacy of the grey - some things are crap so everything is equally crap.
In reality nobody (sane) would claim that Bugzilla is better than Phabricator for example. Some project management tools are better than others.
He's right, though. C++ exists because it was compatible with C code. Not having to rewrite into Java was a major advantage and justifies most of the terrible design decisions made as a result.
Interesting, I think you're interpreting the quote in a different way to me.
Is it "People complain about C++ and not Foo because, even though Foo is better, C++ had to be crap in order for people to use it."
Or "People complain about C++ and not Foo because although they are equally crap, nobody uses Foo so there's nobody to complain about it."
I always thought it was the latter. Unfortunately I can't find much more on where he actually said this than his own quotes page which also doesn't really clarify anything: https://www.stroustrup.com/quotes.html
Speaking as a C++ developer on GPUs, it's mostly the first.
There are so many problems no other language has, due to the backwards-compatibility guarantees Bjarne made to get companies using the language.
For example, C effectively guarantees ABI compatibility: object files from an older compiler/standard can be linked with newer versions of the compiler/standard. This is great if you want to distribute a proprietary library to end users without revealing source code (libcuda & legacy enterprise tools).
But this guarantee applied to C++ means a templated function in the standard library can almost never change the implementation.
As a result of extending the guarantee to templates, the C++ standard library must create new classes and functions then deprecate the old ones, meaning they can't do basic optimizations to older code. A big complaint is "too many ways to do the same thing".
Rust doesn't make this guarantee.
That allows them to make a cleaner language with less duplication, but mostly forces Rust crates to be open-source.
Nvidia shipping libcuda.so in Rust means having to upgrade Rust constantly and potentially forcing customers to, or releasing the CUDA source code which hurts their competitive advantage.
Bjarne could have picked from a number of OOP languages in the 70s and 80s. Smalltalk, Eiffel, and Simula were arguably better languages. But he built C with Classes, which is the reason it's used today.
> but mostly forces Rust crates to be open-source.
In practice how many closed source C++ libraries are there? I can't recall using a single one despite working on many closed source C++ applications. I'm sure they exist but a closed source Rust crate could just do what many C++ libraries do anyway - wrap a C ABI with a thin open source Rust shim.
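The "thin open source Rust shim over a C ABI" pattern the comment describes can be sketched roughly as below. All names here are hypothetical: `vendor_add` stands in for a symbol that a closed-source library would export with a stable C ABI; it's defined locally so the sketch is self-contained and runnable.

```rust
// Sketch of the "closed-source library behind a C ABI, open-source
// Rust shim on top" pattern. In a real setup, `vendor_add` would be
// exported by a prebuilt .so/.a and declared in an `extern "C"` block;
// here it is defined in the same file so the example compiles alone.
#[no_mangle]
pub extern "C" fn vendor_add(a: i32, b: i32) -> i32 {
    a + b
}

/// The open-source shim: a safe, idiomatic Rust API over the C ABI.
/// Only this thin layer needs its source published; the vendor code
/// behind it can stay closed because the C ABI is stable.
pub fn add(a: i32, b: i32) -> i32 {
    // With a real external library this would be an `unsafe` call
    // through an `extern "C"` declaration; no `unsafe` is needed here
    // only because the symbol lives in the same crate.
    vendor_add(a, b)
}

fn main() {
    println!("{}", add(2, 3)); // prints 5
}
```

The point is that the Rust-to-Rust ABI instability never crosses the library boundary: only the C ABI does, so the vendor binary and the shim can be compiled with different Rust versions.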
> which is the reason it's used today.
Obviously it helped that C++ was backwards compatible with C, but all of those languages you mentioned use garbage collection and need some kind of runtime so they were never in the running for a systems programming language like C++. (I know "systems programming" is ambiguous but you know what I mean...)
I totally agree that the requirement of backwards compatibility with C made C++ a lot shitter. The most obvious examples are the automatic type coercion and the insane type syntax.
At least Bugzilla is actively maintained. Choosing abandonware over something Mozilla, Red Hat, Apache, GNOME, and KDE still run production workflows on isn't an obviously sane choice.
This is presumably what Apple does. You kind of have to anyway or you have the stupid situation Linux used to have where only one app could play audio at a time.
> you have the stupid situation Linux used to have where only one app could play audio at a time
When was that? I think my first Linux distribution was Ubuntu 8.04, and I'm fairly sure it shipped with PulseAudio, which as far as I remember has always been able to play audio from multiple sources at the same time. Maybe I misremember?
As I recall it was rarely enabled by default and was a pain to set up so in practice not really used.
The most common solution at the time was PulseAudio, which was so bad it usually was better to just use direct ALSA and live with the idiotic one-at-a-time limitation.
Thankfully Pipewire seems to actually work reliably so I guess that's at least one thing ticked off the Year of the Linux Desktop checklist.
Most distributions shipped ALSA preconfigured with dmix, which means multiple applications could play sound at the same time just fine.
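For reference, the dmix setup being described looked roughly like this in a `~/.asoundrc` (a sketch only; card/device numbers, the `ipc_key`, and the sample rate vary per system, and distributions shipped their own equivalent defaults):

```
# ~/.asoundrc - route the default PCM through dmix so multiple
# applications can play sound at the same time
pcm.!default {
    type plug
    slave.pcm "dmixer"
}

pcm.dmixer {
    type dmix
    ipc_key 1024          # any unique key for the shared mixing buffer
    slave {
        pcm "hw:0,0"      # first card, first device - adjust to taste
        rate 48000
    }
}
```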
Which is why the whole "we must use PulseAudio, even though it's terrible and blasts the volume to maximum, or multiple streams won't work!" argument was so weird… everybody who tried it knew that after removing PulseAudio, multiple streams kept working :)
So only those who never applied the scientific method kept insisting that without PA it was not possible to do that.
I think PA allows setting per-application volumes and has a modular design. But it's kind of the poster child of overengineering (challenged by systemd now). Something like sndiod is more sensible for most desktop distros. People who need a more complex setup can bring in the big guns like PipeWire.
I don't think the problem was over-engineering. I think the problem was that if you plugged in headphones it would instantly set the volume to 100% from whatever value it was before.
Plus of course, initially you had to regularly run killall -9 pulseaudio to fix the sound. All in a moment when ALSA with dmix worked just fine.
Sometimes I think fedora and ubuntu are trying to hinder linux as mainstream desktop.
Even back then, it could play more than one stream. You had to have a sound card or kernel drivers that supported it (and all non-obsolete ones did by the time PulseAudio came out).
I still don’t know what purpose pulseaudio serves, other than adding latency and making stuff less reliable.
PipeWire is better, but it turns out you can just use OSS under FreeBSD these days, and everything just works, with lower latency.
If you have some sort of potato sound card that can’t mix output channels in hardware, note that OSS added sw mixing by 2007 (with support for 16 channels by default).
Sure, sure. I remember a time when I didn't have a sound card that supported it and couldn't play multiple streams at a time; this is a thing that really happened. I did eventually go out and buy a soundcard to enable multiple streams.
Nonsense - HDA systems were overwhelmingly the majority of Linux systems at that point, and didn't have any hardware support for multiple streams. OSS with software mixing was a commercial product that wasn't upstream. ALSA had userspace mixing but it was very much not an out of the box experience, and didn't take advantage of hardware capabilities in the way Pulseaudio did to reduce wakeups and power consumption.
Even so, surely it would have been easier and better to just fix or replace dmix (in kernel, in the existing data path) than introduce a userspace daemon, break API compatibility, and so on.
It's been 20 years and PulseAudio is still flaky / high latency / incomprehensible. Professional workflows that care use stuff like JACK.
PipeWire replaced Pulse like five years ago; who is using Pulse at this point to make statements like "20 years" meaningful? It isn't really an ongoing concern.
Doing audio mixing well is something that is, for a number of reasons, hard to do in kernel. And if you're still using pulseaudio, why? The rest of the world's moved to pipewire, which also provides a jack-compatible interface.
TBH PipeWire works much better than Pulse, to the point of replacing JACK itself. But dmix worked fine for non-professional needs, and with very low CPU usage. Yes, it was JACK for the professionals, but Windows had ASIO drivers too.
This is the era where I was the lead on Ubuntu laptop support, and I promise you that dmix was not a trivial option to make things work out of the box.
I always had some Knoppix live CD/DVD which had better defaults than Ubuntu itself on hardware autodetection and setup. I think they used kudzu from RH for a good while plus custom patches.
Bear in mind the Knoppix creator had a blind wife, which led to his creating ADRIANE, one of the best distros for the blind (and it was merged into main KNOPPIX, making the distro one of the most accessible ones out there). Thus, proper audio mixing was mandatory.
With the bundled installer you could turn it into a Debian testing install on the spot. As I didn't have internet at home, I remember using Knoppix before Debian Sarge because it had a huge amount of things to play with and test, without worrying about odd hardware setups.
Some of the context here is that, at the time, Ubuntu was aiming to work on as close to 100% of existing PCs as possible, to make it available to the largest number of users. Knoppix had a lot of great features and was also very opinionated, and that had an influence on the set of hardware it worked well on by default. I evaluated basically every decision made there in terms of whether Ubuntu should adopt the same ones, and there were several that were just not good choices in terms of supporting the widest set of hardware possible.
If you have two audio streams, you can't play them as-is on the audio device; you have to mix them together. The same is true in the analog domain: playing two sources through one set of speakers requires a mixer circuit. I believe at one point with ALSA, when an application took control of the audio device, nothing else could play audio. Now ALSA comes with dmix (a digital mixer feature) enabled in its default configuration, so two applications can play at the same time. And we have PulseAudio, JACK, and PipeWire on top of ALSA to add more features.
OpenBSD still presents raw audio devices, but it has sndio, which provides a more helpful interface for applications, including resampling (not with the best algorithms, according to them).
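The core of what dmix/PulseAudio do per sample is very small; a minimal sketch (my illustration, not any project's actual code) is summing corresponding samples from two signed 16-bit PCM streams, saturating so loud passages clip instead of wrapping around:

```rust
// Minimal software-mixing sketch: add two i16 PCM streams sample by
// sample, saturating at i16::MIN/i16::MAX instead of wrapping.
fn mix(a: &[i16], b: &[i16]) -> Vec<i16> {
    a.iter()
        .zip(b.iter())
        .map(|(&x, &y)| x.saturating_add(y))
        .collect()
}

fn main() {
    let stream_a = [1000, 20000, -30000];
    let stream_b = [500, 20000, -10000];
    // 20000 + 20000 saturates to i16::MAX (32767);
    // -30000 + -10000 saturates to i16::MIN (-32768).
    println!("{:?}", mix(&stream_a, &stream_b)); // [1500, 32767, -32768]
}
```

Real mixers also resample, attenuate per stream, and dither, which is where the complexity (and the latency arguments above) come from.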
Technically yes, but the batteries on bikes are better protected, mounted higher up, and usually of better quality than cheap scooter batteries, which sit low to the ground. It's more of a statistical judgment.
I don't think that is actually true - upgrading by selling your old laptop and buying a new one is still going to be cheaper than Framework's upgrades. I wish it wasn't but it is.
Not necessarily true, and sometimes not even an option.
In my case for example, the ThinkPad Z13 was the best laptop for me at the time. My main criteria are a top-tier AMD APU, first-class Linux support, a machined-aluminium body, and a 12-14" display. Only the ThinkPad Z13 met those criteria at the time, but there weren't any in stock in my country. So my only option was to import it, and it was pretty expensive with all the import duties. A Framework would've been cheaper, but they didn't ship to my country, and they also didn't have an aluminium chassis.
It was worth it though, I'm still using the ThinkPad and it's hands-down one of the best laptop+Linux experiences I've ever had.
It's now time for an upgrade. Unfortunately, Lenovo killed the Z13 line with no equivalent replacement, and everything else on the market currently has one limitation or another (for me) - the most common being lack of official Linux support.
Thankfully, Framework finally started shipping to my country this year, and their laptops also finally meet all my requirements (the previous models were all lacking the machined aluminum chassis). This also means that I may have found my "final" laptop at last, assuming Framework continues to deliver on their promise. Never again do I have to go hunting around for a Linux-first machined-aluminium top-tier 13" AMD laptop. And the best part is, I can just swap out the mainboard the next time I want to upgrade - which I guarantee will be way cheaper than buying a laptop that meets all my requirements, assuming it even exists.
But if you're not like me and you're fine with any random el cheapo Windows laptop, then yeah Framework is probably not for you.
I dunno what he's referring to but the big changes for me for Facebook were:
1. They let non-uni students on there. Suddenly my crazy aunt was sending me a friend request and they had no option for "be polite but I don't want to have anything to do with her". That diluted the amount of stuff I wanted to see.
2. They allowed sharing posts. This just gradually turned it into shit Reddit.
I expect they're just banking on getting their investment back, with some fat returns, by licensing it to the NSA to decrypt their hoovered-up encrypted comms, with their data storage now reaching the yottabyte level. That's a lotta byte.