Packaging a Huge NodeJS Project in an RPM

My work involves managing and developing a NodeJS project. Recently the whole project, including the npm dependencies, reached 1GB. Building the RPM then kept failing with:

Unable to create immutable header region

After searching around, I found that the Software Development team at the BBC had encountered the same problem and written a Medium article about it. But before I started using their solution, I discovered a much better one. Despite all of that, I still enjoyed their Heaviest Object In The Universe meme.

Compress/Tar During Build

The solution sounds very simple: compress your project. The moment I started getting the rpm header error, this solution was floating at the back of my head. But I thought there must be some sort of RPM flag that lets us raise the limit on the header size. After searching for a very long time, I didn't find any; we even stumbled on the rpm source code responsible for the hard limit. So after some headaches, I went ahead and did what I had in mind in the first place. To summarize, it involves the following steps:

  1. Compress/Tar after installing your npm dependencies.
  2. During rpm post, uncompress the tar files.
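The two steps above can be sketched in plain shell. This is only an illustration: the directory layout and the tarball name (`sampleproject.tar.bz2`) are my own stand-ins, not names from the actual project.

```shell
set -e
# Scratch layout standing in for the build tree and the RPM install root.
rm -rf /tmp/tarball-demo
mkdir -p /tmp/tarball-demo/build /tmp/tarball-demo/root/usr/share/sampleproject
echo 'console.log("hello");' > /tmp/tarball-demo/build/index.js

# Step 1: after `npm install` and the build, bundle everything into one
# bzip2-compressed tarball.
tar cjf /tmp/tarball-demo/sampleproject.tar.bz2 -C /tmp/tarball-demo/build .

# Step 2: what the rpm post script does once the tarball has been copied
# into the install directory.
cp /tmp/tarball-demo/sampleproject.tar.bz2 /tmp/tarball-demo/root/usr/share/sampleproject/
tar xjf /tmp/tarball-demo/root/usr/share/sampleproject/sampleproject.tar.bz2 \
    -C /tmp/tarball-demo/root/usr/share/sampleproject
```

The RPM payload now contains one archive instead of thousands of node_modules files, which is exactly what keeps the header small.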

How does this solve the problem? Instead of rpm building a header that tracks thousands of files, it now only has to track the single compressed tar file along with the other installed files.


With cmake, compressing is as easy as adding another step for creating a tarball. Consider the snippet below:

# CMake code...

add_custom_target(tarball ALL  # target and archive names here are illustrative
  COMMAND tar cjf sampleproject.tar.bz2 .
  COMMENT "Creating project tarball"
  DEPENDS other_deps)

install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/ DESTINATION share/sampleproject)

# CMake code...


Here we begin by creating a target that tars the source or build files. Since you probably ran some transpiling and uglifying on your javascript, you most likely want to tar your build files. Here I used

tar cjf <output tarball> .

where the parameters I used are c, j, and f: c tells tar to create an archive, f receives the output file, and j instructs it to use the bzip2 compression algorithm. The compression algorithm could be anything, but we chose bzip2 since we saved another 100MB of space by doing so. Finally, the dot at the end is simply the directory to compress; it just happens that I execute this under the build directory, thus compressing everything inside it.
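A minimal way to see these flags in action (the directory and file names below are illustrative):

```shell
set -e
rm -rf /tmp/cjf-demo && mkdir -p /tmp/cjf-demo/build
touch /tmp/cjf-demo/build/app.js /tmp/cjf-demo/build/bundle.js

# Run tar from inside the build directory so the trailing `.` archives
# everything in it; the archive is written outside to avoid archiving itself.
cd /tmp/cjf-demo/build
tar cjf /tmp/cjf-demo/out.tar.bz2 .

# `t` lists the archive contents without extracting them.
tar tjf /tmp/cjf-demo/out.tar.bz2
```

The listing shows every file from the build directory packed into the single bzip2 archive.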

Next is the install section, which tells cmake which files are relevant and where to put them during install. Here we used

install(FILES ${CMAKE_CURRENT_SOURCE_DIR}/ DESTINATION share/sampleproject)

where we instruct cmake to take our tarball ${CMAKE_CURRENT_SOURCE_DIR}/ and put it under share/sampleproject, which gets prefixed with CMAKE_INSTALL_PREFIX. (Note: CMake's default prefix is /usr/local, but RPM builds typically set it to /usr, thus the install path in my case is /usr/share/sampleproject.)

Now we know that once we build and call

make install

we will have the tarball under /usr/share/sampleproject/. But how do we decompress it so we can actually get at our node build files? That is the job of the RPM post-install script.


Uncompress/Untar In Post

The post-install script is a local file, post, nested under the rpm directory of the sample project above; your post probably rests somewhere else. In any case, this script is executed by yum/rpm after the tarball is copied to the install directory, and this is where we want to do the untar process. A snippet of my rpm/post is the following:

# ... rpm/post stuff.

echo "Uncompressing sampleproject tarball."
tar -jxf /usr/share/sampleproject/ -C /usr/share/sampleproject

# ... rpm/post stuff.

Here I have x, j, f, and C: x instead of c, to extract instead of create; j again for the bzip2 algorithm; and f giving the path of the tarball file we copied into the install directory in the cmake file above. Finally, C sets the working directory, where the output of the extraction is placed, which is just the install directory.
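The effect of -C is worth seeing once in isolation. A small sketch (all paths are illustrative scratch directories, not the real install locations):

```shell
set -e
rm -rf /tmp/untar-demo && mkdir -p /tmp/untar-demo/src /tmp/untar-demo/dest
echo "data" > /tmp/untar-demo/src/file.txt
tar cjf /tmp/untar-demo/a.tar.bz2 -C /tmp/untar-demo/src .

# Without -C, tar extracts into the current working directory; with -C it
# changes into the given directory first, so the files land there instead.
tar -jxf /tmp/untar-demo/a.tar.bz2 -C /tmp/untar-demo/dest
```

This is why the post script can run from anywhere and still drop the node files into the install directory.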

That’s all folks!

That’s it. To summarize, you just need to somehow shove in tarring during build and untarring during install. Who knew this old tool would find its way into this era of uneconomical software sizes. WebAssembly really can’t come fast enough.
