Sure. If the technology involved in your software startup basically amounts to putting a web or mobile app in front of a Rails/Django/Node appserver in front of an RDBMS or equivalent NoSQL database, and the primary business questions are "What data do we need to store, and how do we present it to the user?", then getting your CI up is important. This described a large fraction of viable startups from 2005-2013, so it accounted for a large number of startup jobs.
However, my experience is that those opportunities are largely tapped out, and the ones that still exist in the market today are increasingly niche. Between YC and other accelerators, services for startups (Clerky/Gusto/Stripe/AWS/etc.), the availability of capital, and the amount of open-source software available for building systems, it's become incredibly easy to found a startup - which means that lots and lots of people have, and they've picked clean most of the markets where a viable product can be made by gluing these components together.
Continued success under capitalism means doing new things that haven't been done before. The web, mobile, and tablets were new platforms that opened up computing to huge new markets. However, all the new platforms since 2010 have fizzled and failed to get mass adoption - wearables, the realtime web, AR, VR, cryptocurrency. That leaves tech startups founded today trying genuinely new things: leveraging AI & data science for new insights, getting access to increasingly unruly data sources that haven't yet been tapped, exploring variations on blockchains & cryptographic proofs, or combining various platforms & libraries in new ways to find new uses. All of these take a lot more experimentation and have much less well-defined best practices than web and mobile development, but the potential upside is much higher than pointing a database-backed webapp at a new set of business problems.