I have a Python script that fetches a website every few minutes and makes an API call when some conditions are met. I never really figured out all those other AWS or cloud tools and usually just resort to renting a virtual server and running the script via ssh.
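For what it's worth, the whole pattern fits in a few dozen lines. Here's a minimal sketch of that kind of poll-and-notify loop, with placeholder URLs and a hypothetical "keyword on the page" condition standing in for whatever your actual check is:

```python
import time
import urllib.request

POLL_INTERVAL = 300  # seconds between checks; pick whatever suits you
WATCH_URL = "https://example.com/status"    # placeholder: the page you watch
API_URL = "https://example.com/api/notify"  # placeholder: the API you call

def condition_met(page_text):
    # Hypothetical condition: trigger while a keyword is absent from the page.
    return "SOLD OUT" not in page_text

def poll_once(url):
    # Fetch the page and return its body as text.
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def notify(api_url):
    # Fire the API call; an empty-body POST as a placeholder.
    req = urllib.request.Request(api_url, data=b"{}", method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status

def main():
    while True:
        try:
            if condition_met(poll_once(WATCH_URL)):
                notify(API_URL)
        except Exception as exc:
            # Swallow transient network errors so the loop keeps running.
            print(f"poll failed: {exc}")
        time.sleep(POLL_INTERVAL)
```

Run `main()` under something like systemd or a `tmux` session on the rented box and it restarts-on-crash is about all the orchestration you need.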
That seems the way to go unless you have insane amounts of data. We had some cloud guys write a pipeline in AWS for us with several steps connected by Lambdas that get triggered at each step. The damn thing is just not reliable. It skips some changes or sometimes doesn't work at all. I'm sure it can be made reliable, but I'm beginning to be very skeptical of all this complexity. I think we could write the whole pipeline in 300 lines of Python with the same results, but somehow that's not "cool" because it doesn't scale. Never mind that the complex stuff doesn't scale either.
That sounds like it works great for your use case. The serverless stuff is best when you want to do things like customer order processing workflows or distributed packet mirroring orchestration, etc.