
> There was also a large amount of concern from the attendees about the impact the introduction of nogil could have on CPython development. Some worried that introducing nogil mode could mean that the number of tests run in CI would have to double. Others worried that the maintenance burden would significantly increase if two separate versions of CPython were supported simultaneously

The 2021 workstation I am typing this on has 16 physical and 32 virtual cores, and I expect core counts to continue upward this decade. While the CPython core devs may be excellent programmers and do a good job at maintaining Python, they clearly have a bit of trouble with cost/benefit analysis if the complaint is that the test count will double in CI for a ~16X increase in the amount of code that can be run in parallel on even consumer machines. Yes, yes, I know that this does not mean my programs will run ~16X as fast. Yes, I know there are other objections. But this is an order of magnitude away from being an actual blocker and the fact it was brought up at all as an objection shows a fundamental disconnect between the small potential costs imposed on core devs, and the huge potential upsides for Python users everywhere.



> they clearly have a bit of trouble with cost/benefit analysis if the complaint is that the test count will double in CI for a ~16X increase in the amount of code that can be run in parallel on even consumer machines

You can already use multiple cores by writing C extensions that release the GIL, or by using multiprocessing. That doubled CI cost and additional work wouldn't just fall on core Python but on the entire ecosystem.
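As a sketch of the multiprocessing route mentioned above (a hypothetical example, not from the thread): CPU-bound work can be farmed out to separate interpreter processes, each of which has its own GIL, so the work really does run on multiple cores.

```python
from multiprocessing import Pool

def count_primes(limit):
    """CPU-bound work: naive prime count below limit."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Each worker is a separate process with its own GIL,
    # so the four calls can run on four cores in parallel.
    with Pool(processes=4) as pool:
        results = pool.map(count_primes, [10_000] * 4)
    print(results)  # four identical counts, computed in parallel
```

The catch, as the replies below point out, is that every argument and result crossing the process boundary must be pickled.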


>You can already use multiple cores by writing C extensions that release the GIL

I chose Python to not have to write C...


Let’s not forget that writing C extensions is the easy part.

The even more sucky part is distributing them and making sure they work on every OS, without every user having to apt install build-utils before they can pip install your package and then spend the rest of the day debugging some rare compilation error caused by a header file mismatch with what's installed on the system. The Python packaging space is already complicated enough even without considering native modules.


The additional complexity from a C extension is dramatic in any sort of Python application, and the performance hit from having to pickle every object you want to share between processes when using multiprocessing is far from trivial.

True shared memory threads within a single process would be a major boon.
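To illustrate the cost being described (a hypothetical micro-benchmark, not from the thread): sending an object to a worker process means serializing it with pickle and deserializing it on the other side, which for large objects can dwarf the work itself. Threads in one process would just share the reference.

```python
import pickle
import time

# A large-ish object we'd like workers to read.
data = list(range(1_000_000))

# With multiprocessing, this serialization happens implicitly for
# every object that crosses the process boundary...
start = time.perf_counter()
blob = pickle.dumps(data)
restored = pickle.loads(blob)
elapsed = time.perf_counter() - start

# ...whereas threads in a single process share `data` directly and
# pay none of this cost (but today they serialize on the GIL instead).
print(f"pickle round-trip: {elapsed * 1000:.1f} ms, {len(blob):,} bytes")
```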


a) This only works for some applications and is a very bad fit, which does nothing, for others. Namely, it only works if 1) you have few hot code locations, 2) that code uses data structures which can feasibly be mogrified into native data structures with Python accessors, and 3) the granularity is coarse, meaning few invocations, each doing lots of work. If any of these conditions aren't met, "native extensions invoked by Python threads" tends to be ineffective but maintenance-intensive.

b) Introducing native extensions means deployment and distribution becomes more difficult, and introduces a whole new and large class of issues and caveats into a project.

c) Native extensions are not written in Python. (Yes, Cython exists, no, it's usually not a good idea to write more than glue in it).
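For contrast with point a): a few stdlib functions already release the GIL internally (hashlib does for large buffers), so plain threads can occasionally use multiple cores — precisely the "few invocations, each doing lots of work" shape described above. A hypothetical sketch:

```python
import hashlib
import threading

# hashlib releases the GIL while hashing large buffers, so this is
# one of the rare stdlib cases where plain threads can overlap on
# multiple cores despite the GIL.
payload = b"x" * (10 * 1024 * 1024)  # 10 MiB of work per call

def work():
    hashlib.sha256(payload).hexdigest()

threads = [threading.Thread(target=work) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Change the shape of the workload — many tiny hashes instead of a few big ones — and the GIL handoff overhead eats the benefit, which is the granularity condition in point a).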


> You can already use multiple cores by writing C extensions

"Don't use Python" is also my preferred recommendation when I encounter Python.


Thanks but:

1. Learning a new language is non-trivial for many people (and don't sneer - it's about time not competency)

2. The ecosystem matters. If the code you want to interface with is in Python then "don't use Python" is just glib.

I'm mainly proficient in JavaScript, Python and C#. My choice of language is rarely based on "which is best for this task?" but mostly "what do I need to run on and interface with?"


I think the person you replied to agrees with you, and was just being snarky.


Regarding point 1: if we are already at the point where you have to write a C extension to work around language lossage, we can assume you know C. At that point it is easier to rewrite in another language than to deal with FFI.


Or just don't write Python. There are a million languages without a GIL.


This is exactly what Perl devs said… Or is your comment sarcasm?



