How do I run Python script from a subdirectory?

Solution 1

How to call another Python script from Python

You want to call a Python script that works correctly only when run from a particular directory, because it has imports which resolve only from that location.

Assuming there is a subdirectory named "script_dir" containing the file "local_script_name.py":

import subprocess
subprocess.call(["python", "local_script_name.py"], cwd="script_dir")

If your script accepts command line arguments "arg1", "arg2", "-etc":

import subprocess
subprocess.call(["python", "local_script_name.py", "arg1", "arg2", "-etc"], cwd="script_dir")
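
If you also want to be sure the child script runs under the same interpreter as the calling program, a small variation (a sketch, not part of the original answer) is to pass sys.executable instead of the bare "python" name and to check the return code:

import subprocess
import sys

# Run the script from its own directory, using the same interpreter as the caller.
return_code = subprocess.call(
    [sys.executable, "local_script_name.py", "arg1", "arg2", "-etc"],
    cwd="script_dir",
)
if return_code != 0:
    print("local_script_name.py exited with code", return_code)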

Solution 2

sys.path.append(os.getcwd()) works for me. Now you can run commands like ./manage/my_cli_command.py.
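
Spelled out with the required imports, a minimal sketch of that idea looks like this:

import os
import sys

# Make modules under the current working directory importable,
# so a script such as ./manage/my_cli_command.py can import from the project root.
sys.path.append(os.getcwd())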

Solution 3

How to write Python scripts efficiently

There are multiple ways to write a Python script (a script that is meant to be called from the command line).

For flexible use, forget about manipulating PYTHONPATH - such solutions are difficult to maintain and rather shaky.

Script with Python stdlib imports only

Betting purely on packages and modules that are part of the Python stdlib makes the script easy to run from anywhere.
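
For illustration, a minimal sketch of such a script (the file name and behaviour are made up, not part of the answer) - it imports only argparse and json from the stdlib, so it can be started from any directory:

#!/usr/bin/env python
"""pretty_json.py - pretty-print a JSON file using only the stdlib."""
import argparse
import json


def main():
    parser = argparse.ArgumentParser(description="Pretty-print a JSON file.")
    parser.add_argument("path", help="path to a JSON file")
    args = parser.parse_args()
    with open(args.path) as f:
        print(json.dumps(json.load(f), indent=2))


if __name__ == "__main__":
    main()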

An extension of this is importing modules and packages that are globally installed. Globally installing packages is mostly considered bad practice, as it pollutes the global Python environment. I have globally installed just a small set of packages which I use often: pyyaml, docopt, lxml, jinja2.

Script importing any installed package

Packages and modules, even your own, can be installed by means of setup.py. It took me a while to get used to this; I initially ignored setup.py as too complex a solution, but later I came to use it for any home-made package which I need to import in my programs.
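
A minimal setup.py for such a home-made package could look roughly like this (a sketch; "mypackage" is a hypothetical name). Once installed, e.g. with pip install ., the package is importable from any directory:

# setup.py - a minimal sketch for installing your own package
from setuptools import setup, find_packages

setup(
    name="mypackage",
    version="0.1.0",
    packages=find_packages(),
)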

Importing packages installed into virtualenv

This is by far the solution I use most often now.

You will need your packages to have a setup.py.

After the virtualenv is created, you use pip to install all the packages you need.

With the given virtualenv active, you can then simply run the script, which will see the installed packages regardless of the directory you start it from.
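
For example (a sketch, assuming you installed pyyaml into the virtualenv with pip), the script just imports the package and does not care about the directory it was started from:

# Works from any directory, as long as it runs under the virtualenv's Python.
import yaml

print(yaml.safe_load("answer: 42"))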

Script installed by means of setup.py

Again, this solution seems too complex at first "just for a simple script", but in the end it can turn out to be the simplest method.

You have to create a simple Python project containing just your script and a setup.py. The setup.py shall declare the script for installation. There are two methods: either the scripts argument or entry_points, the latter being more flexible - use that one.

The script is best installed from within a virtualenv - after the installation is complete, find the location of the script and move it wherever you want to use it. To use the script, there is no need to activate the virtualenv; it is enough to have it on the PATH. The installed script includes a reference to the Python interpreter of the given virtualenv, which ensures it runs with that interpreter and all of its installed packages.

Note that the best way to have the required packages installed for your script is to mention them in the install_requires argument in setup.py; this ensures they get installed together with your script automatically.
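
Putting the last two points together, a sketch of such a setup.py could look like this (all names are hypothetical, and docopt only stands in for whatever your script actually needs):

# setup.py - a sketch of the entry_points approach
from setuptools import setup, find_packages

setup(
    name="mytool",
    version="0.1.0",
    packages=find_packages(),
    install_requires=["docopt"],  # dependencies get installed automatically
    entry_points={
        "console_scripts": [
            # installs a "mytool" command that calls main() in mytool/cli.py
            "mytool = mytool.cli:main",
        ],
    },
)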

Using zc.buildout

zc.buildout used to be my favourite solution until pip and virtualenv matured.

You have to create a buildout.cfg file in which you define all that is needed for your script. This mostly includes having the setup.py in place, but that is not always necessary.

Personally, I found zc.buildout powerful but rather difficult to learn, and I would recommend the virtualenv solution instead.

Using other solutions like PyInstaller

There are other solutions which allow turning a Python script into a single executable file.

PyInstaller looks like a good way to go. There might be other similar solutions.

Conclusions

Forget about making your modules and packages importable by manipulating PYTHONPATH; such solutions are quite shaky.

My recommendation is to go for a project with a setup.py that uses entry_points to install your script.

Installation into a virtualenv seems to be the cleanest method.

Setting up this environment (the project) takes some effort the first time, but after that you can simply copy and adapt the solution, and it will become the natural way to go.

Updated on June 09, 2022

Comments

  • Blairg23 almost 2 years

    I want to be able to write a line of code like this and have it run smoothly:

    python /path/to/python_file.py -arg1 -arg2 -etc
    

    I figured out an easy way to discover all modules and add them to the current Python path, but it still doesn't seem to recognize the .py file, even though it's supposedly in the sys.path. I know the sys.path addition is working because I can perform this in the interpreter just fine:

    >>>import ModuleManager # My Python script to discover modules in lower directories
    >>>import testModule # Module I want to run from lower directory
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    ImportError: No module named testModule
    
    >>>ModuleManager.discoverModules() # Now, I find the modules and add them to path
    Discovering Modules...Complete!
    Discovered 6 Modules.
    >>>import testModule # No error now.
    >>>
    

    What I would like to do at this point is be able to go into my terminal and say:

    python testModule -arg1 -arg2 -etc
    

    and have it perform how I would expect.

    To be clear, I want to write a line of code in ModuleManager.py (from my application root folder) that allows me to access a file named testModule.py which is found in /root/path/to/testModule.py in a way such that I can use arguments as in python testModule.py -arg1 -arg2 -etc. Any suggestions?

  • Blairg23 almost 10 years
    I need this to be able to run from a folder that will be moved to different computers every time. It is meant to run simple test scripts, but my test scripts are in different folders and they need to be run as terminal commands with command line arguments. I don't want to have to install anything; I want it to work "as-is" with the default Python libraries. Can this accomplish that in one or two lines of code?
  • Jan Vlcinsky almost 10 years
    @Blairg23 I misunderstood your question. New answer added (as another answer).
  • Blairg23 almost 10 years
    Very cool. I am currently using subprocess to run my terminal commands! The problem is that my user will not be able to specify where the directory is; the solution needs to be more robust and to find or "discover" arbitrary Python modules and add them to the current usable commands. Basically, the user should be able to run my script and use Python modules as if they were shell commands. In other words, if I had a module named runMe.py that was located in /path/to/runMe.py, I want the user to be able to say python runMe.py -arg1 -arg2 and not be concerned with its location.
  • Jan Vlcinsky almost 10 years
    @Blairg23 You mentioned you know how to discover the files to call. If you have the file name, then use os.path functions to split it into the directory and the base file name.
  • Blairg23 almost 10 years
    Ah, very good! I didn't even think of that! The only problem is that right now, the script the user runs does not know whether the item is a module or a regular terminal command, and I would like to keep it that way... my ModuleManager.py script does all the discovering and adding to path for easy importing, but it has no way to tell the main script's subprocess command to use the cwd="script_dir" parameter. Can you think of a work-around? Honestly, adding all the subdirectories to the path would be ideal.
  • Jan Vlcinsky almost 10 years
    @Blairg23 If you manage to import that file, try __path__ to learn where it lives.
  • Blairg23 almost 10 years
    Correct, but the main problem I'm having isn't being able to import the file. I'm not trying to import the file whatsoever. I simply want to be able to run it. I want the main script to do the following: 1) discover all subdirectory modules (without importing them) and 2) run them like terminal commands with arguments. The problem is, I don't want to import all of the modules I discover (since most of them are modules that are in the main directory and don't need special importing). Once I discover them, I want to assume they are all in the same directory. Meanwhile, I want them organized.
  • Blairg23 almost 10 years
    I would just put them in the same directory, which would solve all my problems, but I want to organize them in subdirectories for the human users. I want something to the effect of /tools/networking/net_tool.py and /tools/audio/audio_tool.py, etc. for easy organization. But in the end, I want the main script to not know the difference between a module and a regular terminal command. Therefore, the main script doesn't do anything aside from running strings, which it assumes are legal terminal commands. (A rough sketch of this discover-and-run idea follows below.)
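
A rough sketch of the discover-and-run idea from these comments (the helper names and directory layout are made up; it simply combines os.walk with the subprocess call from Solution 1, so each discovered script runs in its own directory):

import os
import subprocess
import sys


def find_script(name, root="."):
    """Walk the tree below root and return the directory containing name, or None."""
    for dirpath, dirnames, filenames in os.walk(root):
        if name in filenames:
            return dirpath
    return None


def run_script(name, args, root="."):
    """Run a discovered script in its own directory so its local imports still work."""
    script_dir = find_script(name, root)
    if script_dir is None:
        sys.exit("Could not find %s" % name)
    return subprocess.call([sys.executable, name] + list(args), cwd=script_dir)


if __name__ == "__main__":
    # e.g.: python run_tool.py net_tool.py -arg1 -arg2
    run_script(sys.argv[1], sys.argv[2:])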