IMO that's the typical experience with many of the features in modern C++ standards. You read about a really neat, useful thing they added, something that seems to provide a safe and practical way to overcome a shortcoming in the language. You may even get a little excited... until you try to actually use it and realize it's full of new footguns and weird limitations.
Yes, you read about std::variant on a blog and think that it is a sum type. Then you try it out and realize that it's a thin (type-safe) wrapper over tagged unions that is at least three times slower and has about 5 unreadable alternatives that replace simple switch statements.
Then you find out that members of a "variant" are not really variant members but just the individual types that can be assigned to a union. For example, assigning to a non-const reference does not work (and obviously cannot work once you realize that std::variant is just syntax sugar over a tagged union).
Most of these new additions since C++11 are just leaky abstractions and wrappers.
> and obviously cannot work once you realize that std::variant is just syntax sugar over a tagged union
It would be easy to make it work; there isn't necessarily a strict relation between the template parameter and the actual stored object. Not having reference variant members was a conscious decision, same as optional<T&>. Hopefully this will be fixed in the future.
A few code snippets of what you see as weaknesses of std::variant may be appropriate, as I couldn't figure out your complaint. Assigning to a variant taken by non-const& works fine for me.
I personally would have liked to see recursive variant types and multi-visitation (as supported by boost::variant).
std::variant is not a true algebraic data type, since the individual element constructors do not construct the variant type automatically. Compare to OCaml, written in a verbose and unidiomatic way that is similar to C++:
# type foo = Int of { n : int } | Float of { f : float };;
type foo = Int of { n : int; } | Float of { f : float; }
# Int { n = 10 };;
- : foo = Int {n = 10}
# let r = ref (Int { n = 10 });;
val r : foo ref = {contents = Int {n = 10}}
Notice that the constructor Int { n = 10 } automatically produces a foo type and assigning to a mutable ref works.
The same in C++, using assignment to a pointer to avoid the lvalue ref error that is irrelevant to this discussion:
#include <variant>
struct myint {
int n;
myint(int n) : n(n) {}
};
struct myfloat {
float f;
myfloat(float f) : f(f) {}
};
using foo = std::variant<myint, myfloat>;
int
main()
{
const foo& x = myint{10}; // works
foo *z = new myint{10}; // error: cannot convert ‘myint*’ to ‘foo*’
}
As stated above, this obviously cannot work since C++ has no way of specifying a myint constructor that -- like in OCaml -- automatically produces the variant type foo.
C++ would need true algebraic data types with compiler support (that would hopefully be as fast as switch statements). To be useful, they would need a nice syntax and not some hypothetical abomination like:
using foo = std::variant<myint, myfloat> where
struct myint of foo { ... };
There is a difference between an API promising that a value won't be null and a buggy program storing null where it shouldn't. A reference is only null if someone fucked up. As a programmer you can usually rely on a reference not being null, and within the constraints of the language you couldn't do anything about it if it were anyway.
No, dereferencing a null ptr is UB. An enum class outside the declared values is perfectly valid.
You could design a language feature where integer to enum is checked, but that's not enum.
Enum classes already add scoping, forbid implicit conversions and allow explicit underlying types. Those are pure extensions. Making undeclared values invalid or UB would be very surprising to people used to normal enums.
One of those cases happens accidentally all the time (in more complex variants than the motivating example you responded to), the other never happens except on purpose. It's like complaining guard rails are pointless because people being launched with catapults might still fly over them and plunge to their deaths.
I've never seen a nullptr somehow sneak into a reference. Never.
What I have seen is automatic variables escaping their scope as a reference, which Rust protects against. And that is also much more dangerous, because on most platforms dereferencing a nullptr has a well-defined effect in practice: an immediate crash.
This doesn't really seem like their strategy anymore. It's not like Edge directly interprets Typescript, for example. While they embraced and extended Javascript, any extinguishing seems to be on the technical merits rather than corporate will.
In the case of security scanners that run in the kernel, we learned this weekend that a market need exists. The mainstream media blamed Crowdstrike's bugs on "Windows". Microsoft would likely want to wash its hands of future events of this class. A Linux-like eBPF mechanism is a path forward that lets people run the software they want (work-slowers like Crowdstrike) while insulating Microsoft's reputation from that software.
I wonder if the amazing idea that creating value for the customer creates value for the shareholders will ever catch on? I call it trickle up economics.
There is a line of thought that the more value you give a customer the less your company will keep for itself and the less a shareholder will make. A volunteer gives away all their value to the recipient. An entrepreneur wants to collect money in exchange for value created. Far more money than they spent creating the value for the customer.
man, this made me a little sad to read and think about. and not because of your comment specifically: i know that this must be, generally, how one has to think and talk about economics in any sort of academic or abstract sense, but... i guess i just haven't really done that very much
Another thing to think about that is also sad and just sorta becomes part of the background noise of existing in society is exploitation. If you ignore the emotionally charged connotations around the term and focus on its meaning:
> make full use of and derive benefit from (a resource).
Capital is in the business of exploiting Labor. It is the only thing that makes sense in our economic system. For anyone doing labor for $X for a company that then sells it to a customer for $Y, it must hold that X < Y or else the company will fail. The arrangement most people operate under, myself included, is one where you are willingly underselling yourself. The trade-off most people point to is that people are willing to do this because the stability of getting $X all the time, regardless of this week's / month's / year's sales, is worth foregoing $Y; essentially you are buying stability with the difference. Although the recent rash of layoffs at companies, regardless of how well they are doing, does offer a counterpoint to that theory.
It's actually a Tragedy of the Commons situation. If very few people think that way, everyone is better off and the society is a better place. But in such a place, the people that have this mindset will absolutely be more successful than if they didn't have it.
At this point we're decades into this flavor of capitalism and the profiteering has long since become the norm, sadly.
And before someone says "it's always been like that"... No, it hasn't. Even in pop culture from a few decades ago you've got the characters taking great pride in providing good products for great prices. Nowadays our culture mostly makes fun of people like that and they're portrayed as targets to be exploited...
I have always found it amusing how much of our modern economic system looks something like this.
- What if everyone shared and cooperated and got along?
- Well then someone that wanted to take advantage of people would come along and make a big mess of it!
- Well what's your proposal then, should we stop those people, outlaw their greed and avarice?
- Hmm... no I think we should reward it and build our entire system around getting greed to produce useful outcomes.
- Ok, so we've been doing that for a while and the people that would have taken advantage of the commons have privatized the commons, taken full advantage of that, purchased the government, and essentially rule over us while we all fight for crumbs...
- But you've got an iPhone, so it's basically working great
How is this creating value for shareholders? They wasted a bunch of engineer and pm salaries building useless products that never made (and now never will make) any significant money.
But that's not how shareholder value is created. Shareholder value is created by press releases and hype, encouraging new people to buy the stock and drive up the price, not by building anything useful to society or long-term profitable.
So if Google Fit drove a hype cycle, it was successful.
That's the great part: creating value for customers can pay out for shareholders as well in the long run. But it's probably not a viable option if you want quick returns.
This is the third case: creating value for employees. Google promotes people for making new things. Thereby guaranteeing product churn.
(edit: the limit case is of course Elon Musk arguing that he should simply be given the entire company treasury as payroll at Tesla, one single employee taking all the value produced)
Does anyone know how I would recreate this effect with 2 mp3s acting as 2 radio stations? I have tried fading in samples of static but it doesn't feel the same.
The traditional(?) version of that tool[1] could also maybe work if you are allowed to make a loop of the wires for twisting. The tool itself is just a hook on an angled axle that rotates in the handle. It's called 'surrauskoukku' in Finnish, not sure about the English name.