Thanks for writing down the history and reasoning.
> the fact that these functions exist prevents Python from evolving.
That’s a legitimate concern, sure, but I don’t think it needs to be solved by proactively removing all problematic API.
If this is “private” API, and we are allowed to remove it without a deprecation period, then IMO we should do that when it starts causing trouble.
Some of the functions you’ve removed are unlikely to cause trouble.
> Moreover, it’s a big burden for other Python implementations like PyPy, since they actually have to implement private functions as soon as they are actually used by C extensions.
> This is where pythoncapi-compat can help, by providing implementations that rely on public API.
Also, it’s an inconvenience for a limited number of well-maintained projects (PyPy, HPy), which have largely solved this already. We can help them by making sure we don’t add new questionable API, but removing what they already worked around doesn’t seem too useful.
> Sometimes, when I see a private function, I don’t know its purpose, I don’t know how it’s used, I don’t know how it’s supposed to behave. It costs me the “Chesterton’s fence” maintenance burden: it takes me more time to think about such private API, compared to when I meet a public API (well defined, documented, tested, with backward-compatibility guarantees).
And so, you take the most drastic action available – removing the API entirely?
I don’t understand.
> My goal [is to] clarify the distinction between public and private APIs
> I would like to make the C API smaller
That is a good goal, but I don’t think you need to remove API to get there.
The underscore is already a clear marker. So is Py_DEPRECATED. We can combine them. Another idea that was floated around was a macro to disable everything that’s discouraged as of 2023. But, again, you’re taking the most drastic option available.
> I suggest making decisions on a case-by-case basis
Yes, that would be great.
If I disagree with your decisions, how should I react?
> When a new C API is added, an implementation for Python 3.12 can be added to the pythoncapi-compat project.
IMO, that’s a great use case for pythoncapi-compat. It lets you use the latest and greatest API even on older Python versions, if you want to.
However, I don’t think the existence of pythoncapi-compat should justify removing old API.
A C library is not an easy dependency to add (and keep up to date). And pythoncapi-compat also needs tests and docs; isn’t the maintenance burden similar to keeping the API in CPython itself?
Most old API works. It might be inefficient, or use an older naming convention, or be difficult to use correctly, or not be thread-safe, or have weird edge cases, or be a no-op, but if someone uses it despite the shortcomings, I don’t think CPython should force them to rewrite code just because we found a slightly better way of doing things.
Of course not all old API is like that. But most is, IMO.
I think Python breaks too much. The 3.11 update was painful, 3.12 is not much better, and 3.13 is shaping up to follow the trend. Each breakage we make is a reason for someone to discontinue a working library, or abandon Python altogether. Breakage is hurting the project.
Practicality should beat purity. It’s harder that way, but I think not breaking users unless necessary should be much, much higher on our list of priorities. Just because PEP 387 says an API can change without notice doesn’t mean it should be removed ASAP.
Can we find a way to mark old API as discouraged, but keep users’ code working as long as possible?
Can we limit the breakage to API that needs to change to support new optimizations and features?