high_byte's comments | Hacker News

as I mentioned in another reply, all those projects also wanted system APIs like filesystem and sockets and such.

for me I just want to hijack the interpreter so I don't have to write my own. no imports, no sockets.


No, I'm not.

I'm talking about the history behind why rexec, Bastion, and restricted execution were removed from Python in the 2.x days. See https://python.readthedocs.io/en/v2.7.2/library/restricted.h... , "In Python 2.3 these modules have been disabled due to various known and not readily fixable security holes."

They started because back in the 1.x days there was a Python web browser called Grail, and the hope was to support restricted Python applets in Grail.

Or from 10 years ago, read https://lwn.net/Articles/574215/ about the failure of 'pysandbox' where one of the ways to break out was to "[use] a traceback object to unwind the stack frame to one in the trusted namespace, then use the f_globals attribute to retrieve a global object." ... "Stinner's tale should serve as a cautionary one to anyone considering a CPython-based solution".
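
To make the traceback trick concrete, here is a minimal sketch of the idea (not Stinner's actual exploit; it assumes a plain CPython exec() at module level whose restricted builtins still expose print and Exception, and SECRET/payload are made-up names for illustration):

  import secrets

  SECRET = secrets.token_hex(8)       # pretend this lives only in the trusted namespace

  payload = """
  try:
      raise Exception("probe")
  except Exception as exc:
      frame = exc.__traceback__.tb_frame    # frame of this exec'd code
      while frame.f_back is not None:       # unwind toward the trusted caller
          frame = frame.f_back
      print("escaped with:", frame.f_globals.get("SECRET"))
  """

  d = {"__builtins__": {"print": print, "Exception": Exception}}
  exec(payload, d, d)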

You might consider RestrictedPython at https://restrictedpython.readthedocs.io/en/latest/ which supports only a subset of Python, via AST-walking to limit what the code can do. I have no experience with it.
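
From a skim of its docs, the basic shape is roughly: compile through its restricting AST transformer instead of the builtin compile(), then exec the result with a guarded builtins dict. A rough, untested sketch (the exact import and guard names may differ, so check the docs):

  from RestrictedPython import compile_restricted, safe_builtins

  src = "result = 40 + 2"
  code = compile_restricted(src, "<sandbox>", "exec")   # AST is checked here

  scope = {"__builtins__": safe_builtins}
  exec(code, scope, scope)
  print(scope["result"])                                # 42

  # dunder tricks should be rejected at compile time, e.g. this raises SyntaxError:
  # compile_restricted("().__class__.__bases__", "<sandbox>", "exec")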


I didn't use RestrictedPython. I did manage to patch the __subclasses__() escape with a hack. if only I could patch the exception traceback too, I think it would be good enough :)

edit: here are my silly little patches: https://github.com/hananbeer/cpython-toy-sandbox/commit/fa3f...

this is of course assuming exec(code, {...}) with certain builtins removed, and the code is, again, not expected to use system APIs like files or sockets or anything.
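
for context, the escape in question is the well-known one below; on an unpatched interpreter it works even with an empty builtins dict, which is why it needs patching rather than just stripping builtins (sketch only, the actual patch is in the commit above):

  d = {"__builtins__": {}}
  exec("classes = ().__class__.__bases__[0].__subclasses__()", d, d)
  # every class loaded in the interpreter is now reachable; some of them
  # (e.g. os._wrap_close) typically expose modules via __init__.__globals__
  print(len(d["classes"]))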


As a reminder, in case you didn't consider it, some code in your exec string might be run after the exec has finished, due to garbage collection.

    d = {"__builtins__": {"print": print}}

    exec("""

    def delay_until_gc():
      try:
        try:
          yield 1
        finally:
          print((1).__class__.__bases__[0].__subclasses__()[:3])
      except:
        raise

    it = delay_until_gc()
    it.__next__()

    """, d, d)
    del d
    print("Finished exec.")
The output for this on my system is

  Finished exec.
  [<class 'type'>, <class 'async_generator'>, <class 'bytearray_iterator'>]
which means you'll need to ensure those dictionaries are cleared and garbage collected before you can clear your toybox state, something like:

  import gc
  toybox(1)
  exec(..., d, d)
  del d
  gc.collect()
  toybox(0)
The "del" is not good enough due to the cyclical reference as the iterator function's globals contain the active iterator.

If you allow any mutable object into the globals or locals dictionary, such that the exec'ed code can attach something to it, then you can't even use gc.collect() to ensure the exec'ed code can no longer be executed.
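
A minimal sketch of that last failure mode, reusing the pattern above (the name "shared" is just for illustration): the exec'd code parks its suspended generator on a mutable object the trusted side still holds, so it survives both the del and the gc.collect().

  import gc

  shared = []                          # trusted-side list handed into the sandbox
  d = {"__builtins__": {"print": print}, "shared": shared}

  exec("""
  def delay():
      try:
          yield 1
      finally:
          print("still ran, long after gc.collect()")

  it = delay()
  it.__next__()
  shared.append(it)                    # stash the live generator outside the sandbox dicts
  """, d, d)

  del d
  gc.collect()                         # doesn't help: 'shared' keeps the generator alive
  shared.clear()
  gc.collect()                         # only now does the finally clause run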


thanks, RestrictedPython looks like it could work for me!


I believe that's basically Docker, which uses Linux seccomp, but there are also sandboxes for language-specific applications.

ps. browsers basically do that with javascript


but all those projects also wanted system APIs like filesystem and sockets and such.

for me I just want to hijack the interpreter so I don't have to write my own. no imports, no sockets.


these are all great. I love that I can now just refer to the razors by name


I love my Sony XMs. I only ever use the noise cancelling on flights and I don't even use the bluetooth. just plugged in, and I love them.


why not? if you change one pixel by one brightness unit, it is perceptually the same.

for the record, I found liveportrait to be well within the uncanny valley. it looks great for AI-generated avatars, but the difference is very noticeable on familiar faces. still, it's great.


For one, it doesn't obey the transitive property like a truly lossless process should: unless it settles into a fixed point, a perceptually lossless copy of a copy of a copy, etc., will eventually become perceptually different. E.g., screenshot-of-screenshot chains, each of which visually resembles the previous one, but which altogether make the original content unreadable.
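
A rough way to watch that drift, as a sketch (assumes Pillow; the gradient source image, quality setting and generation count are arbitrary stand-ins, and with identical settings the chain may well hit the fixed point mentioned above):

  from io import BytesIO
  from PIL import Image, ImageChops, ImageStat

  img = Image.radial_gradient("L").convert("RGB")    # stand-in source image
  original = img.copy()

  for _ in range(100):                               # copy of a copy of a copy...
      buf = BytesIO()
      img.save(buf, format="JPEG", quality=90)
      buf.seek(0)
      img = Image.open(buf).convert("RGB")

  drift = ImageStat.Stat(ImageChops.difference(original, img)).mean
  print("mean per-channel error after 100 generations:", drift)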


Perceptual closure under repeated iterations is just a stronger form of perceptual losslessness, then, after k generations instead of the usual k=1. What you’re describing is called generation loss, and there are in fact perceptually lossy image codecs that have essentially no generation loss; jpeg xl is one https://m.youtube.com/watch?v=FtSWpw7zNkI


GP is correct, that’s the definition of “lossy”. We don’t need to invent ever new marketing buzzwords for well-established technical concepts.


GP is incorrect.

There is "Is identical", "looks identical" and "has lost sufficient detail to clearly not be the original." - being able to differentiate between these three states is useful.


Importantly, the first one is parameterless, but the second and third are parameterized by the audience. For example, humans don't see colour very well; some animals have a much better colour gamut, while others can't distinguish colour at all.


Perceptually lossless (nature for dogs) video compression at 15bit/s.


Lossless means "is identical".

The other two are variations of lossy.

Calling one of them "perceptually lossless" is cheating, to the disadvantage of algorithms that honestly advertise themselves as lossy while still achieving "looks identical" compression.


It's a well-established term, though. It's been used in academic works for a long time (since at least 1970), and it's basically another term for the notion of "transparency" as it relates to data compression.


I honestly don't notice this anymore. Advertisers have been using such language since time immemorial, to the point it's pretty much a rule that an adjective with a qualifier means "not actually ${adjective}, but kind of like it in ${specific circumstances}". So "perceptually lossless" just means "not actually lossless, except you couldn't tell it from truly lossless just by looking".


But this marketing term has been regularly used in academic papers for nearly 50 years (or probably more), so it seems like it should get a pass IMO.

It's also used in the first paragraph of the Wikipedia article on the term "transparency" as it relates to data compression.


It is in no way the definition of lossy. It is a subset of lossy. Most lossy image/video compression has visible artifacting, putting it outside the subset.


Sony WH-1000XM4

and before you get discouraged by my story, know that my first reaction to this incident was "damn, these are my favorite headphones". I had them for years, travelled with them, use them practically daily, and they still look brand new.

can't say I'm a big user of the noise cancellation or even ambient sounds or bluetooth. just aux in the laptop and on planes, it's amazing.

9.99/10 ears still ringing, would buy again.



seems to be becoming the norm... uber eats decided to just not show up. can't contact driver. can't contact uber. why offer support if you have a billion users? sounds like massive overhead. so lose a few users, it's cheaper.


(I am sure this has been studied to death ...)

At what point, at what scale (i.e. 1bn users), does it become humanly impossible to offer support? To scale offering "human" support, that is.

At some point the "right to talk to a human" will become regulated ...


Or look at it from a different angle: at what scale (i.e. 1bn users) does it become possible to stop offering support? Simply don't care about those users with issues or questions.


if you enjoy finding & fixing bugs, you might enjoy bug bounties! try HackerOne! https://www.hackerone.com/

