Automatically Build NPM Module On Install From Github


Edit: Detecting if the package is being installed from git repo

I didn't understand the question properly at first. Below are some things I wrote that are a bit off-topic. For now, if you want to run build.js only when installing from the repo:

Files in repo:

 .gitignore
 .npmignore
 ThisIsNpmPackage
 build.js
 package.json

The .gitignore:

ThisIsNpmPackage

The .npmignore:

!ThisIsNpmPackage

In the package.json:

"scripts": {
    "install": "( [ ! -f ./ThisIsNpmPackage ] && [ ! -f ./AlreadyInstalled ] && echo \"\" > ./AlreadyInstalled && npm install . && node ./build.js ) || echo \"SKIP: NON GIT SOURCE\""
}

The idea is to make the file ThisIsNpmPackage available in the repo, but not in the npm package.

The install hook is just a piece of bashy script that checks if ThisIsNpmPackage exists. If it does, we execute npm install . (this ensures we have the devDependencies). The file AlreadyInstalled is generated to prevent infinite looping (npm install would otherwise recursively invoke the install hook).

When publishing, I do git push and npm publish.
Note that npm publish can be automated via CI tools or githooks.

This little hack with the file ThisIsNpmPackage makes source detection possible.

Results of invoking npm install dumb-package:

"SKIP: NON GIT SOURCE"

And when executing npm install https://github.com/styczynski/dumb-package:

The files will be built.

The issues

The main issues we are facing here are the following:

  • Have to do npm publish ... every time

    Sometimes it's too much pain to fix a small bug, push to the repo, and then forget to publish on npm. When I was working on a microservices-based project with about 5 standalone subprojects divided into modules, finding an issue, fixing it, and then forgetting to publish in every place I had to was really annoying.

  • Don't want to push lib into the repo, because it's derived from sources

  • Rebasing and merging is even more annoying.

  • No mess with .gitignore

    Heck, you know that problem when you have troublesome files that you have to include inside the repo but never modify, or sometimes remove? That's just sick.

Edit: npm hooks

As @Roy Tinker mentioned, there exists the ability for a package to execute a command when installed.
It can be achieved via npm hooks.

"install": "npm run build"

And we execute:

npm install https://github.com/<user>/<package>

Edit:
OP question from comments:

But this will run an install for everyone downloading the module from npm right? This is hugely problematic given that dev dependencies will not be installed for anyone downloading the module from npm. The libs used to build the app - babel etc will not be installed.

Note: if you want a specific version of the package (production/dev), with or without dev dependencies, you can install it via:

npm install --only=dev

The --only={prod[uction]|dev[elopment]} argument will cause either only devDependencies or only non-devDependencies to be installed regardless of the NODE_ENV.

A better solution, in my opinion, is to use:

npm install <git remote url>

And then inside package.json specify:

"scripts": {
    "prepare": "npm run build"
}

If the package being installed contains a prepare script, its dependencies and devDependencies will be installed, and the prepare script will be run, before the package is packaged and installed.

Example:

npm install git+https://isaacs@github.com/npm/npm.git

Read the npm docs there: npm install
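Putting the prepare approach together, a minimal package.json might look like the following. This is a hedged sketch: the build command and the babel-cli devDependency are placeholder choices for illustration, not taken from a real package.

```json
{
  "name": "dumb-package",
  "version": "1.0.0",
  "main": "lib/index.js",
  "scripts": {
    "build": "babel src --out-dir lib",
    "prepare": "npm run build"
  },
  "devDependencies": {
    "babel-cli": "^6.26.0"
  }
}
```

With this, npm install git+https://github.com/<user>/dumb-package installs the devDependencies, runs prepare (and hence the build), and only then packages and installs the module.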

Edit: proxy module (advanced technique)

It's kind of bad practice, but good to know.

Sometimes (as in the case of the Electron framework) you need to install other external packages or resources/modules depending on various conditions.

In these cases the proxy idea is used:

  • You make a module that behaves like an installer and installs all the dependent things you want

In your case the prepare script will be enough, but I leave this option here because it may sometimes be helpful.

The idea is that you write a module and write an install hook for it:

"scripts": {
    "install": "<do the install>"
}

In this scenario you can place there:

npm install . && npm run build

Which installs all devDependencies anyway (as the aforementioned prepare case does), but it's a bit of a hack.

If you want to do some real hacking there:

"scripts": {
    "install": "curl -L -J -O \"<some_url>\""
}

which manually downloads files using the *nix command curl.

It should be avoided, but it's an option in the case of a module that has huge binary files for each platform and you don't want to install them all.

Like in the case of Electron, where you have compiled binaries (one for each platform).

So you want people to run npm install package, not npm install package-linux or npm install package-windows, etc.

So you provide a custom install script in the package.json:

{
  ...
  "scripts": {
     "install": "node ./install_platform_dep.js"
  }
}

Then, when installing the module, the install_platform_dep.js script will be executed. Inside install_platform_dep.js you place:

// For Windows...
if (process.platform === 'win32') {
    // Trigger Windows module installation
    exec('npm install fancy-module-windows', (err, stdout, stderr) => {
        // Some error handling...
    });
} else {
    // ... process other OSes
}

And this installs everything in a purely manual way.

Note: once again, this approach is usable with platform-dependent modules; if you use it otherwise, it's probably a design issue in your code.

Build on CI

What comes to mind is a solution that I have used for a really long time: automatic building with CI services.

Most CI services' main purpose is to test/build/publish your code when you push to a branch or perform other actions on the repo.

The idea is that you provide a settings file (like .travis.yml or .gitlab-ci.yml) and the tools take care of the rest.

If you really don't want to include the lib into the project, just trust CI to do everything:

  • A githook will trigger building on commit (on a branch or any other - it's just a matter of config)
  • CI will build your files, then pass them to the test phase and publish them

Now I'm working on GitLab on my own project, doing (as part of a hobby) a webpage. The GitLab configuration that builds the project looks like this:

image: tetraweb/php

cache:
  untracked: true
  paths:
    - public_html/
    - node_modules/

before_script:
  - apt-get update

stages:
  - build
  - test
  - deploy

build_product:
  stage: build
  script:
    - npm run build

test_product:
  stage: test
  script:
    - npm run test

deploy_product:
  stage: deploy
  script:
    - npm run deploy

When I merge into the main branch the following events happen:

  • CI runs the build stage
  • If the build succeeds, then the test stage is launched
  • If the test phase is OK, then finally the deploy stage is triggered

The script is the list of unix commands to be executed.

You can specify any Docker image in the config, so you can in fact use any Unix flavour you want, with (or without) preinstalled tools.

There is a package deploy-to-git which deploys artefacts to the desired repo branch.

Or here (for Travis CI) is a piece of config that publishes artefacts to the repo:

travis-publish-to-git

(I have used it myself)

Then, of course, you can let CI run:

npm publish .

Because CI executes Unix commands, it can (at least with a good number of CI providers):

  • Publish tags (release tag maybe?)
  • Trigger script to update version of the project in all READMEs and everywhere
  • Send you a notification if all phases succeeded

So what I do:
I commit, push and let the tools do everything else I want.
In the meantime, I make other changes and after one to ten minutes get an update report by mail.

There are plenty of CI providers out there.

Here I attach another example of my other project (.travis.yml):

language: generic
install:
    - npm install
script:
    - chmod u+x ./utest.sh
    - chmod u+x ./self_test/autodetection_cli/someprogram.sh
    - cd self_test && bash ../utest.sh --ttools stime --tno-spinner

If you set up CI to push and publish your package, you can always be sure you're using the latest cutting-edge version of your code, without worrying about the eh, now I also have to run this command... problem.

I recommend choosing one of the CI providers out there.
The best ones offer you hundreds of capabilities!

When you get used to automating the publish, test and build phases, you will see how much it helps you enjoy life!
Then, to start another project with automatic scripts, just copy the configs!

Summary

In my opinion, the npm prepare script is a good option.
You may also want to try the others.

Each of the described methods has its drawbacks and can be used depending on what you want to achieve.
I just wanted to provide some alternatives; I hope some of them fit your problem!


prepare is the correct way, but might seem broken

If you have a repository with source files but a "build" step is necessary to use it,
prepare does exactly what you want in all cases (as of npm 4).

prepare: Run both BEFORE the package is packed and published, on local npm install without any arguments, and when installing git dependencies.

You can even put your build dependencies into devDependencies and they will be installed before prepare is executed.

Here is an example of a package of mine that uses this method.


Problems with .gitignore - prepare will seem broken

There is one issue with this option that gets many people. When preparing a dependency, npm and Yarn will keep only the files that are listed in the files section of package.json.

One might see that files defaults to all files being included and think they're done. What is easily missed is that:

  • .npmignore mostly overrides the files directive and,
  • if .npmignore does not exist, .gitignore is used instead.

So, if you have your built files listed in .gitignore, like a sane person, and don't do anything else, prepare will seem broken.

If you fix files to only include the built files or add an empty .npmignore, you're all set.

My recommendation is to set files (or, by inversion, .npmignore) such that the only files actually published are those needed by users of the published package. Imho, there is no need to include uncompiled sources in published packages.
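For example, a hedged sketch of such a package.json fragment, assuming the build step emits everything into lib/ (the build command is a placeholder):

```json
{
  "files": [
    "lib/"
  ],
  "scripts": {
    "build": "babel src --out-dir lib",
    "prepare": "npm run build"
  }
}
```

Note that package.json, README, and LICENSE are always included in the published tarball regardless of the files list.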


Assuming there is a build script within the package.json file's scripts field, can the package be configured to run this automatically in this situation?

Yes. There are 2 things you need to do:

  1. Make sure your system uses npm or yarn to install the package from GitHub. If this package is a dependency of another package, you can use the GitHub URL in place of the version number in package.json. Otherwise, the following command will work:

    npm install https://github.com/YourUser/your-package

    You can append #v1.0.0 (or a branch name) to the end of the URL if you're after a specific tag or branch.

  2. Add the following to the scripts in your module's package.json:

    "install": "npm run build"

install is a hook that the package manager executes when installing the module (there are also preinstall and postinstall -- see the documentation).
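As a sketch of how those hooks fit together in the scripts field (the echo commands are placeholders showing when each hook fires):

```json
{
  "scripts": {
    "preinstall": "echo before install",
    "install": "npm run build",
    "postinstall": "echo after install",
    "build": "echo building"
  }
}
```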

Documentation: https://docs.npmjs.com/misc/scripts