I was recently looking through the CPython source code and noticed something rather intriguing:
/* Minimal main program -- everything is loaded from the library */

#include "Python.h"
#include "pycore_pylifecycle.h"

#ifdef MS_WINDOWS
int
wmain(int argc, wchar_t **argv)
{
    return Py_Main(argc, argv);
}
#else
int
main(int argc, char **argv)
{
    return _Py_UnixMain(argc, argv);
}
#endif
The main function just calls a different function, as stated in the comment above it: "Minimal main program". And the two functions it calls (Py_Main and _Py_UnixMain) don't seem to do anything drastically different or very complex either; they both eventually call the same function (pymain_main):
int
Py_Main(int argc, wchar_t **argv)
{
    _PyMain pymain = _PyMain_INIT;
    pymain.use_bytes_argv = 0;
    pymain.argc = argc;
    pymain.wchar_argv = argv;
    return pymain_main(&pymain);
}

int
_Py_UnixMain(int argc, char **argv)
{
    _PyMain pymain = _PyMain_INIT;
    pymain.use_bytes_argv = 1;
    pymain.argc = argc;
    pymain.bytes_argv = argv;
    return pymain_main(&pymain);
}
It appears that those operations could very easily be done directly in the main or wmain functions, as in the sketch below.
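For illustration, here is roughly what I have in mind. This is an untested sketch based only on the snippets above; it assumes _PyMain, _PyMain_INIT and pymain_main would be visible from python.c, which may well not actually be the case:

/* Hypothetical, untested sketch: inlining the argument set-up
   from Py_Main/_Py_UnixMain into the entry points themselves. */
#include "Python.h"

#ifdef MS_WINDOWS
int
wmain(int argc, wchar_t **argv)
{
    _PyMain pymain = _PyMain_INIT;
    pymain.use_bytes_argv = 0;    /* wide-character arguments on Windows */
    pymain.argc = argc;
    pymain.wchar_argv = argv;
    return pymain_main(&pymain);
}
#else
int
main(int argc, char **argv)
{
    _PyMain pymain = _PyMain_INIT;
    pymain.use_bytes_argv = 1;    /* byte-string arguments elsewhere */
    pymain.argc = argc;
    pymain.bytes_argv = argv;
    return pymain_main(&pymain);
}
#endif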
My question is whether there is a clear benefit to this choice from a design and structure point of view. Why might the developers of CPython have decided to effectively create a new main function instead of using the standard one? Does it make maintenance or debugging easier?
The #ifdef directive appears to be the motivation: they wanted to distinguish between Windows and other operating systems, hence the separate main and wmain?