It depends on what other tools you are currently using for ETL (or want to use). One example is that we used Spark so we would use the Spark submit operator to submit jobs to clusters. You can also use the K8s Pod operator if you want to utilize containers for your compute.
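For reference, wiring those two operators into a DAG looks roughly like this. This is a sketch, not a drop-in file: the import paths vary by provider package version, and the application path, image name, and connection ID are made-up placeholders.

```python
# Illustrative DAG combining SparkSubmitOperator and KubernetesPodOperator.
# Import paths depend on your provider package versions
# (apache-airflow-providers-apache-spark, apache-airflow-providers-cncf-kubernetes).
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

with DAG("example_etl", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    spark_job = SparkSubmitOperator(
        task_id="spark_job",
        application="/jobs/transform.py",  # hypothetical job path
        conn_id="spark_default",           # hypothetical Spark connection
    )
    container_job = KubernetesPodOperator(
        task_id="container_job",
        image="my-registry/etl:latest",    # hypothetical container image
        name="etl-pod",
    )
    spark_job >> container_job  # run the container step after the Spark job
```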
There are a lot of options. We were adopters before AWS hosted Airflow was a thing, so I don't have any experience running AWS hosted Airflow.
I haven't looked recently to see if some of the challenges we faced early on are solved now, but most of them stemmed from how DAG updates were handled: changing the start date on a DAG would break your DAG forever until you went in and updated the metadata database by hand. Things like this are(/were?) super painful and could get worse with a managed solution.