Regarding the Keboola Custom Science application decommission, there are two ways we can take to move the extraction from a Custom Science application to the AWS S3 extractor with CSV format. Say I have a Custom Science Python application that collects data into in.c-raw-data.flight_issued; here are the steps.

The AWS S3 extractor writes its output into the bucket in.c-keboola-ex-aws-s3-440188041. All tables under this one bucket are the output of a single AWS S3 Extractor configuration, for example: https://connection.keboola.com/admin/projects/1218/extractors/keboola.ex-aws-s3/440188041. In this case the extracted table is in.c-keboola-ex-aws-s3-440188041.flight_issued_with_marketing_attributes.

To keep using the data as in.c-raw-data.flight_issued, the first thing to do is to remove in.c-raw-data.flight_issued (copy its existing rows into in.c-keboola-ex-aws-s3-440188041.flight_issued_with_marketing_attributes first if necessary), because Keboola will reject an alias whose name matches an existing table.
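
If you would rather script the removal than click through the Storage console, a minimal sketch could look like the following. It assumes the DELETE /v2/storage/tables/{id} endpoint and the X-StorageApi-Token header as described in the public Storage API docs, so verify both before running it, and make sure any rows you still need have already been copied into the new table; the KBC_STORAGE_TOKEN environment variable name is just a placeholder.

    import os
    import requests

    STORAGE_API = "https://connection.keboola.com/v2/storage"
    TOKEN = os.environ["KBC_STORAGE_TOKEN"]   # placeholder env var name
    OLD_TABLE_ID = "in.c-raw-data.flight_issued"

    # Drop the old table so the alias name becomes available.
    # Copy any rows you still need into the new table before running this.
    resp = requests.delete(
        f"{STORAGE_API}/tables/{OLD_TABLE_ID}",
        headers={"X-StorageApi-Token": TOKEN},
    )
    resp.raise_for_status()
    print(f"Dropped {OLD_TABLE_ID} (HTTP {resp.status_code})")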

The first way is to switch everything that still reads in.c-raw-data.flight_issued over to in.c-keboola-ex-aws-s3-440188041.flight_issued_with_marketing_attributes (again, copy the existing rows into in.c-keboola-ex-aws-s3-440188041.flight_issued_with_marketing_attributes first if necessary).
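
A low-tech way to find which configurations still reference the old table is to pull the project configurations into a local folder (for example with the Keboola CLI, if it is set up for this project) and scan the JSON for the old table id. The directory name below is hypothetical.

    from pathlib import Path

    OLD_TABLE_ID = "in.c-raw-data.flight_issued"
    CONFIG_DIR = Path("project-export")   # hypothetical local export directory

    # Print every exported configuration file that still mentions the old table.
    for path in CONFIG_DIR.rglob("*.json"):
        if OLD_TABLE_ID in path.read_text(encoding="utf-8"):
            print(f"{path}: still references {OLD_TABLE_ID}")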

The second way is to create an alias from in.c-keboola-ex-aws-s3-440188041.flight_issued_with_marketing_attributes to in.c-raw-data.flight_issued, and put those transformations in the orchestration if you want the data to be updated.
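
The alias itself is easiest to create in the Storage UI, but it can also be scripted. The sketch below is only an outline under assumptions: the POST /v2/storage/buckets/{bucket}/table-aliases endpoint and its sourceTable/name parameters are taken from my reading of the Storage API docs and should be confirmed first, and KBC_STORAGE_TOKEN is again a placeholder.

    import os
    import requests

    STORAGE_API = "https://connection.keboola.com/v2/storage"
    TOKEN = os.environ["KBC_STORAGE_TOKEN"]   # placeholder env var name
    SOURCE_TABLE = "in.c-keboola-ex-aws-s3-440188041.flight_issued_with_marketing_attributes"

    # Create an alias named flight_issued in in.c-raw-data that points at the new table.
    resp = requests.post(
        f"{STORAGE_API}/buckets/in.c-raw-data/table-aliases",
        headers={"X-StorageApi-Token": TOKEN},
        data={"sourceTable": SOURCE_TABLE, "name": "flight_issued"},
    )
    resp.raise_for_status()
    print("Created alias:", resp.json().get("id"))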