Sorry to ruin your ruining, but if you read past the abstract and look at the data, you'll see it tends to correlate with whether a democrat or republican is in office. Immigration policy is also mentioned in the discussion.
> Given these findings, a corollary question is what attracts foreign graduate students to the US and leads them to stay. Prior research points to immigration policy—a subject of perennial public interest—having a large effect on stay rates
My understanding is that the Grok API is way different from the Grok X bot. Which of course doesn't do Grok as a business any favors. Personally, I do not engage with either.
Grok is good for up-to-the-minute information, and for requests that other chat services refuse to entertain, like requests for instructions on how to physically disable the cellular modem in your car.
I sat in my kid's extracurricular a couple months ago and had an FBI agent tell me that Grok was the most trustworthy based on "studies," so that's what she had for her office.
It's excellent, and it doesn't get into the weird ideological ruts and refusals other bots do.
Grok's search and chat are better than the other platforms', but not $300/month better; ChatGPT seems to be the best pro-class bot with no rate limits. If Grok 5 is a similar leap in capabilities as 3 to 4 was, then I might pay the extra $100 a month. The "right wing Elon sycophant" thing is a meme based on hiccups with the public-facing Twitter bot. The app, API, and web bot are just generally very good, and do a much better job at neutrality and counterfactuals, and at not refusing over weird moralistic nonsense.
I didn't find 2 surprising either, but I'm a little surprised you never see it. If you want to treat the args to a function as immutable, what can you do besides copy, modify, and return a new object?
> what can you do besides copy, modify, and return a new object?
You can directly produce a modified copy, rather than using a mutating operation to implement the modifications.
It should be noted that "return a modified copy" algorithms can be much more efficient than "mutate the existing data" ones. For example, consider the case of removing multiple elements from a list, specified by a predicate. The version of this code that treats the input as immutable, producing a modified copy, can perform a single pass:
def without(source, predicate):
    return [e for e in source if not predicate(e)]
whereas mutating code can easily end up with quadratic runtime — and also be difficult to get right:
def remove_which(source, predicate):
    i = 0
    while i < len(source):
        if predicate(source[i]):
            # Each deletion requires O(n) elements to shift position.
            del source[i]
        else:
            # The index increment must be conditional, since removing
            # an element shifts the next one into the current position,
            # and that shifted element must also be considered.
            i += 1
Yes, you can avoid the performance degradation if you don't care about order (swap each doomed element with the last one, then pop). But it's even more complex.
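For concreteness, here's a sketch of that swap-with-last trick (my assumption of what's meant; names are made up):

```python
def remove_unordered(source, predicate):
    """Remove matching elements in O(n) total, without preserving order."""
    i = 0
    while i < len(source):
        if predicate(source[i]):
            # Overwrite the doomed element with the last one, then pop
            # the tail: O(1) per removal, but the survivors get reordered.
            source[i] = source[-1]
            source.pop()
            # Don't advance i: the swapped-in element still needs checking.
        else:
            i += 1
```

Each removal is O(1) because nothing shifts, but as the parent says, it's fiddlier than the comprehension.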
Or if you do care about order, you can emulate the C++ "erase-remove" idiom, by keeping track of separate "read" and "write" positions in the source, iterating until "read" reaches the end, and only incrementing "write" for elements that are kept; and then doing a single `del` of a slice at the end. But this, too, is complex to write, and very much the sort of thing one chooses Python in order to avoid. And you do all that work, in essence, just to emulate what the list comprehension does but in-place.
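A sketch of that erase-remove emulation, to show how much machinery it takes compared to the one-line comprehension (function name is mine):

```python
def remove_which_ordered(source, predicate):
    """Order-preserving in-place removal in one pass (erase-remove idiom)."""
    write = 0
    for read in range(len(source)):
        if not predicate(source[read]):
            # "write" trails "read", compacting kept elements leftward.
            source[write] = source[read]
            write += 1
    # A single slice deletion removes the stale tail at the end.
    del source[write:]
```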
You can do the in-place variant using a generator expression, writing the result back into the original vector as you go. The generator's read position runs ahead of the write position, so it should work fine.
It will also probably be significantly slower than just copying the vector.
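If I'm reading the parent right, the idea is something like this (a sketch, with made-up names; the key safety property is that the generator has always read at least as many elements as have been written):

```python
def remove_in_place_gen(source, predicate):
    """In-place removal driven by a generator expression over the same list."""
    kept = (e for e in source if not predicate(e))
    write = 0
    for e in kept:
        # The generator has already consumed index >= write, so this
        # write never clobbers an element the generator hasn't seen yet.
        source[write] = e
        write += 1
    del source[write:]
```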
I hadn't heard of the author before this. I'll definitely read more of their stuff, but I thought the bottom line for part three was a little incomplete.
> Bottom line: the more uncertainty, indeterminacy, ambiguity in your game, the more depth it will have.
Sure, starting from 0%, adding uncertainty adds depth. But the player needs to maintain some influence over that uncertainty. If you crank the uncertainty up to 100%, then it's pure randomness, which isn't deep or fun.
I've noticed a similar more-is-better trend in a few sequels I've played, where the first game had, say, 5 mechanics that were fun. Then the sequel has 10 mechanics, and because 10 is more than 5, it therefore must be more fun. But it ends up being too much shit to juggle and less fun as a result.
In a game design context, he is definitely using "uncertainty" in a wider sense, as popularized by Greg Costikyan's Uncertainty in Games book.
In that sense of the word, it's not only about random things, but also things like "will I click at just the right time to head-shot that enemy?" or "will I checkmate next turn, or will my opponent think of some clever move that I don't see?". And the theory is that once you run out of uncertain things there is no longer a game, as the player knows how it will end and there is nothing more that can fail or anything unexpected that can happen. Basically like reading the end of a book you have already read before, so you know exactly what will happen.
And depth from a game design pov is also not necessarily strictly positive. Make the game too deep and there is, as you say, pure randomness. You could keep adding rules to chess to make it 100% impossible for any human to remotely guess what kind of move to make, and at that point you've added so much uncertainty that it became too deep.
There's been quite a few games in recent years where I notice some system and think "ugh, do I really need to bother with this, too?". Especially crafting or skill point systems which feel slapped on. Some games make them a fun and integral part of the gameplay, some seem to include them because it's trendy and it just adds friction and mental load with little payoff.
I don't mind complexity, some of my favorite games are ridiculously complex (Dwarf Fortress), but the complexity needs to pay for itself.
I’ve had similar thoughts too: the older I get, the less “extra features” translate to value if I’m expected to stretch my concentration across all of them to have fun.
I’m not as sophisticated as the average Dwarf Fortress player, but an emergent quality of that game that I’ve admired from afar has been how you can ignore various mechanics and you’re rewarded with an interesting ride.
It’s dynamic enough that by pulling various gameplay “levers” you can get wildly different outcomes (and thus value through replayability), but things will sort of run themselves (for better or worse) if you forget about them. So you’re half writing your own story, half discovering it as it writes itself.
My cynical take is that crafting systems are probably the most attractive on the ratio of "amount of dev effort required to implement" relative to "amount of play time added." They're also trivially tunable. You can add (or subtract) hours of play time just by changing the numbers required to craft things.
Unless they're an integral feature of the game (like in Minecraft), they always feel slapped on to me.
Remember, it's about prediction (point 1 of the 12). Pure randomness cannot be predicted. From a prediction point of view, it is therefore, ironically, an already determined result. So it is solved, and therefore not interesting.
In Theory of Fun, I phrased this as "everything has patterns, but if you are not equipped to see the pattern, it becomes noise, and therefore boring."
Yeah, you need to strike a balance. Maybe ambiguity is a better way to look at it than uncertainty or randomness; chess is fun, but the only random factor is the whims of your opponent. There's no randomness, but there is ambiguity about what their strategy is, and whether they're seeing something that you're missing.
An extreme example of more-is-better is games like EU4, where just understanding how trade works is more complicated than most entire games, and that's just a single subsystem. You can ignore it, but mastering it can be satisfying. Or frustrating.
It also matters a lot what type of uncertainty a game has, and what the curve of learning to manage it is.
E.g. slight variations in inputs should produce a slight but ideally meaningful variation in output, so the outcome of pressing keys is both reliable as well as an open space for further mastery.
It's also important that you can trace and understand what happened in retrospect. Just missing because of a 5% chance isn't fun. Missing because you didn't consider wind direction and the movement of an object between you and the target on the other hand is perfectly grokkable.
In some sense though, 100% randomness is meta-predictable: something happens that I can’t predict. There’s a lot less tension. Idk where in the middle is the best spot, I guess that’s where the artistry is
It's like an image, you want neither a single solid colour nor perfect noise, but something in-between with identifiable features, highs and lows. When it changes unexpectedly it should change into something new and exciting, not more noise.
The title contradicts the article. Our eyes are averaging information that is up to 15 seconds old, but we can obviously react to visual stimuli in less than 15 seconds so it’s not all that old. They’re probably also confounding the research by using faces as the thing that changes over time, since our brains are sort of hardwired to respond to faces differently than other stuff
Meanwhile the local “eco friendly” pesticide company is knocking on everyone’s door, showing them pictures of scary looking bugs, offering to treat their entire yard from the foundation to the pavement. I don’t get it
> It’s very easy. Insects are annoying. Nobody wants them near.
Insects are an enormous class. (Well, I had to look that up, and maybe insects are just part of the class Insecta? Anyway, they're an enormous whatever they are.) I'm not a big fan of mosquitoes or ticks, and I try to shoo wasps away from building nests on my porch, but, for example, bees and butterflies are marvellous creatures whom I welcome to my garden and enjoy watching and having all around me.
OSH Park is pretty great for prototyping. $5-20 / square inch depending on the board type. I remember running into some constraint limits years ago but they may have improved since then.
Flies are pollinators in the US too, but the second to last paragraph is a little worrisome
> Another important point is that we have very good evidence of range declines and even extinctions among wild pollinators, which aren’t being managed by anyone.
Yea. That's big. You'd think the farming lobby would be on this one. Not wanting to mis-characterise them, but I think a lot of farmers are bigly MAGA, so you'd think defunding a program that keeps them in production would be a candidate for "no, fund this one," right?
What were the results of your study? I’ve heard that bat lungs are so sensitive that when they fly across the pressure differential of large turbines their capillaries basically explode
Yes basically. Bird lungs are relatively rigid, open at both ends like a tube, and have a one-way flow of air, so they are less prone to pressure-related injuries. Bat lungs are mammalian lungs that expand and contract as they breathe just like us, so they are particularly vulnerable to barotrauma near wind turbines.
After writing a bunch of MATLAB code to find the bats, I handed it off and haven't heard back about whether they actually built the wind turbines or not.