I'm talking about the history behind why rexec, Bastion, and restricted execution were removed from Python in the 2.x days. See https://python.readthedocs.io/en/v2.7.2/library/restricted.h... , "In Python 2.3 these modules have been disabled due to various known and not readily fixable security holes."
They started because back in the 1.x days there was a Python web browser called Grail, and the hope was to support restricted Python applets in Grail.
Or from 10 years ago, read https://lwn.net/Articles/574215/ about the failure of 'pysandbox' where one of the ways to break out was to "[use] a traceback object to unwind the stack frame to one in the trusted namespace, then use the f_globals attribute to retrieve a global object." ... "Stinner's tale should serve as a cautionary one to anyone considering a CPython-based solution".
You might consider RestrictedPython at https://restrictedpython.readthedocs.io/en/latest/ which supports only a subset of Python, via AST-walking to limit what the code can do. I have no experience with it.
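For reference, a minimal sketch of how it is typically used, based on its documentation rather than experience, so treat the details as approximate:

    # compile through RestrictedPython's AST checks, then exec with its
    # curated safe builtins instead of the real ones
    from RestrictedPython import compile_restricted, safe_globals

    byte_code = compile_restricted("result = 1 + 1",
                                   filename="<untrusted>", mode="exec")
    ns = {}
    exec(byte_code, dict(safe_globals), ns)
    print(ns["result"])  # 2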
I didn't use RestrictedPython.
I did manage to patch the __subclasses__() escape with a hack.
if I can patch the exception traceback escape too, I think it will be good enough :)
this is of course assuming exec(globals={..}) with certain builtins removed, and the code is, again, not expected to use system APIs like files or sockets or anything.
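For context, this is the classic escape being patched: even with empty builtins, any literal gives a path back to every loaded class in the process:

    # tuple -> object -> every subclass of object, no builtins required
    ns = {}
    exec("leaked = ().__class__.__base__.__subclasses__()",
         {"__builtins__": {}}, ns)
    print(len(ns["leaked"]))  # hundreds of classes, despite empty builtins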
which means you'll need to ensure those dictionaries are cleared and garbage collected before you can clear your toybox state, something like:
    import gc

    toybox(1)                 # enter your restricted/patched state
    d = {"__builtins__": {}}  # restricted globals/locals for the exec'ed code
    exec(..., d, d)           # run the untrusted code (source elided)
    del d                     # drop your reference to the namespace
    gc.collect()              # break reference cycles so frames actually die
    toybox(0)                 # only now is it safe to clear the toybox state
The "del" is not good enough due to the cyclical reference as the iterator function's globals contain the active iterator.
If you allow any mutable object into the globals or locals dictionary, such that the exec'ed code can attach something to it, then you can't even use gc.collect() to ensure the exec'ed code can no longer be executed.
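A sketch of that failure mode, assuming the host accidentally shares a plain list with the sandbox:

    import gc

    shared = []                              # host-owned mutable object
    d = {"__builtins__": {}, "out": shared}  # it leaks into the sandbox
    exec("out.append(lambda: 'still here')", d, d)
    del d
    gc.collect()
    print(shared[0]())  # the exec'ed code outlives the namespace teardown

The host-side list keeps the lambda alive, and the lambda's __globals__ keep the entire exec namespace reachable.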
why not? if you change one pixel by one brightness unit, it is perceptually the same.
for the record, I found liveportrait to be well within the uncanny valley. it looks great for ai generated avatars, but the difference is very perceptually noticeable on familiar faces. still it's great.
For one, it doesn't obey the transitive property like a truly lossless process should: unless it settles into a fixed point, a perceptually lossless copy of a copy of a copy, etc., will eventually become perceptually different. E.g., screenshot-of-screenshot chains, each of which visually resembles the previous one, but which altogether make the original content unreadable.
Perceptual closure under repeated iterations is just a stronger form of perceptual losslessness, then, after k generations instead of the usual k=1. What you’re describing is called generation loss, and there are in fact perceptually lossy image codecs that have essentially no generation loss; jpeg xl is one https://m.youtube.com/watch?v=FtSWpw7zNkI
There is "Is identical", "looks identical" and "has lost sufficient detail to clearly not be the original." - being able to differentiate between these three states is useful.
Importantly, the first one is parameterless, but the second and third are parameterized by the audience. For example, humans don't see colour very well: some animals have a much better colour gamut, while others can't distinguish colour at all.
Calling one of them "perceptually lossless" is cheating, to the disadvantage of algorithms that honestly advertise themselves as lossy while still achieving "looks identical" compression.
It's a well-established term, though. It's been used in academic works for a long time (since at least 1970), and it's basically another term for the notion of "transparency" as it relates to data compression.
I honestly don't notice this anymore. Advertisers have been using such language since time immemorial, to the point it's pretty much a rule that an adjective with a qualifier means "not actually ${adjective}, but kind of like it in ${specific circumstances}". So "perceptually lossless" just means "not actually lossless, except you couldn't tell it from truly lossless just by looking".
It is in no way the definition of lossy. It is a subset of lossy. Most lossy image/video compression has visible artifacting, putting it outside the subset.
and before you get discouraged by my story, know that my first reaction to this incident was "damn, these are my favorite headphones". I had them for years, travelled with them, used them practically daily, and they still look brand new.
can't say I'm a big user of the noise cancellation or even ambient sounds or bluetooth. just aux in the laptop and on planes, it's amazing.
seems to be becoming the norm...
uber eats decided to just not show up. can't contact driver. can't contact uber.
why offer support if you have a billion users? sounds like massive overhead. so lose a few users, it's cheaper.
Or look at it from a different angle: at what scale (i.e. 1bn users) does it become possible to stop offering support? Simply don't care about those users with issues or questions.
for me, I just want to hijack the interpreter so I don't have to write my own. no imports, no sockets.
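A sketch of that baseline, assuming an exec() namespace with empty builtins: the import machinery looks up __import__ in __builtins__, so imports fail outright:

    # with no __import__ available, the import statement cannot run
    try:
        exec("import socket", {"__builtins__": {}}, {})
    except ImportError as e:
        print("blocked:", e)  # __import__ not found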