~~~~ Execution Summary - RUN_INGEST ~~~~
Execution finished with errors.
{'exec_id': '15b278f7-1c4b-4cad-82e2-70eb10e62ebe',
 'infos': ['2024-02-14 05:51:53.570017 INFO: Starting execution for task with name=RUN_INGEST',
           "2024-02-14 05:51:59.855685 INFO: Failed to execute 'datahub ingest', exit code 1",
           '2024-02-14 05:51:59.857924 INFO: Caught exception EXECUTING task_id=15b278f7-1c4b-4cad-82e2-70eb10e62ebe, name=RUN_INGEST, '
           'stacktrace=Traceback (most recent call last):\n'
           '  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/default_executor.py", line 140, in execute_task\n'
           '    task_event_loop.run_until_complete(task_future)\n'
           '  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete\n'
           '    return future.result()\n'
           '  File "/usr/local/lib/python3.10/site-packages/acryl/executor/execution/sub_process_ingestion_task.py", line 282, in execute\n'
           '    raise TaskError("Failed to execute \'datahub ingest\'")\n'
           "acryl.executor.execution.task.TaskError: Failed to execute 'datahub ingest'\n"],
 'errors': []}

~~~~ Ingestion Report ~~~~
{
  "cli": {
    "cli_version": "0.12.1.5",
    "cli_entry_location": "/usr/local/lib/python3.10/site-packages/datahub/__init__.py",
    "py_version": "3.10.13 (main, Jan 17 2024, 05:40:33) [GCC 12.2.0]",
    "py_exec_path": "/usr/local/bin/python",
    "os_details": "Linux-6.5.11-linuxkit-aarch64-with-glibc2.36",
    "mem_info": "86.1 MB",
    "peak_memory_usage": "86.1 MB",
    "disk_info": {
      "total": "62.67 GB",
      "used": "17.22 GB",
      "used_initally": "17.22 GB",
      "free": "42.23 GB"
    },
    "peak_disk_usage": "17.22 GB",
    "thread_count": 1,
    "peak_thread_count": 1
  },
  "source": {
    "type": "dbt",
    "report": {
      "events_produced": 0,
      "events_produced_per_sec": 0,
      "entities": {},
      "aspects": {},
      "warnings": {},
      "failures": {},
      "soft_deleted_stale_entities": [],
      "sql_statements_parsed": 0,
      "sql_parser_detach_ctes_failures": 0,
      "sql_parser_skipped_missing_code": 0,
      "start_time": "2024-02-14 05:51:57.568132 (now)",
      "running_time": "0 seconds"
    }
  },
  "sink": {
    "type": "datahub-rest",
    "report": {
      "total_records_written": 0,
      "records_written_per_second": 0,
      "warnings": [],
      "failures": [],
      "start_time": "2024-02-14 05:51:57.220382 (now)",
      "current_time": "2024-02-14 05:51:57.569555 (now)",
      "total_duration_in_seconds": 0.35,
      "max_threads": 15,
      "gms_version": "v0.12.1",
      "pending_requests": 0
    }
  }
}

~~~~ Ingestion Logs ~~~~
Obtaining venv creation lock...
Acquired venv creation lock
venv is already set up
venv setup time = 0 sec
This version of datahub supports report-to functionality
+ exec datahub ingest run -c /tmp/datahub/ingest/15b278f7-1c4b-4cad-82e2-70eb10e62ebe/recipe.yml --report-to /tmp/datahub/ingest/15b278f7-1c4b-4cad-82e2-70eb10e62ebe/ingestion_report.json
[2024-02-14 05:51:57,143] INFO     {datahub.cli.ingest_cli:147} - DataHub CLI version: 0.12.1.5
[2024-02-14 05:51:57,222] INFO     {datahub.ingestion.run.pipeline:238} - Sink configured successfully. DataHubRestEmitter: configured to talk to http://datahub-gms:8080
[2024-02-14 05:51:57,568] INFO     {datahub.ingestion.run.pipeline:255} - Source configured successfully.
[2024-02-14 05:51:57,568] INFO     {datahub.cli.ingest_cli:128} - Starting metadata ingestion
[2024-02-14 05:51:57,569] INFO     {datahub.ingestion.reporting.file_reporter:52} - Wrote UNKNOWN report successfully to <_io.TextIOWrapper name='/tmp/datahub/ingest/15b278f7-1c4b-4cad-82e2-70eb10e62ebe/ingestion_report.json' mode='w' encoding='UTF-8'>
[2024-02-14 05:51:57,569] INFO     {datahub.cli.ingest_cli:133} - Source (dbt) report:
{'events_produced': 0,
 'events_produced_per_sec': 0,
 'entities': {},
 'aspects': {},
 'warnings': {},
 'failures': {},
 'soft_deleted_stale_entities': [],
 'sql_statements_parsed': 0,
 'sql_parser_detach_ctes_failures': 0,
 'sql_parser_skipped_missing_code': 0,
 'start_time': '2024-02-14 05:51:57.568132 (now)',
 'running_time': '0 seconds'}
[2024-02-14 05:51:57,570] INFO     {datahub.cli.ingest_cli:136} - Sink (datahub-rest) report:
{'total_records_written': 0,
 'records_written_per_second': 0,
 'warnings': [],
 'failures': [],
 'start_time': '2024-02-14 05:51:57.220382 (now)',
 'current_time': '2024-02-14 05:51:57.570033 (now)',
 'total_duration_in_seconds': 0.35,
 'max_threads': 15,
 'gms_version': 'v0.12.1',
 'pending_requests': 0}
[2024-02-14 05:51:58,058] ERROR    {datahub.entrypoints:201} - Command failed: [Errno 2] No such file or directory: '/Users/arunkumar/Documents/Datatool/data_catalog/target/manifest.json'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/datahub/entrypoints.py", line 188, in main
    sys.exit(datahub(standalone_mode=False, **kwargs))
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 454, in wrapper
    raise e
  File "/usr/local/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 403, in wrapper
    res = func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 201, in run
    ret = loop.run_until_complete(run_ingestion_and_check_upgrade())
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 185, in run_ingestion_and_check_upgrade
    ret = await ingestion_future
  File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 139, in run_pipeline_to_completion
    raise e
  File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 131, in run_pipeline_to_completion
    pipeline.run()
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 404, in run
    for wu in itertools.islice(
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 126, in auto_stale_entity_removal
    for wu in stream:
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 150, in auto_workunit_reporter
    for wu in stream:
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 206, in re_emit_browse_path_v2
    for wu in stream:
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 247, in auto_browse_path_v2
    for urn, batch in _batch_workunits_by_urn(stream):
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 385, in _batch_workunits_by_urn
    for wu in stream:
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 163, in auto_materialize_referenced_tags
    for wu in stream:
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 70, in auto_status_aspect
    for wu in stream:
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/dbt/dbt_common.py", line 806, in get_workunits_internal
    all_nodes, additional_custom_props = self.load_nodes()
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/dbt/dbt_core.py", line 495, in load_nodes
    ) = self.loadManifestAndCatalog()
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/dbt/dbt_core.py", line 435, in loadManifestAndCatalog
    dbt_manifest_json = self.load_file_as_json(
  File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/dbt/dbt_core.py", line 422, in load_file_as_json
    with open(uri, "r") as f:
FileNotFoundError: [Errno 2] No such file or directory: '/Users/arunkumar/Documents/Datatool/data_catalog/target/manifest.json'
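
~~~~ Recipe Sketch ~~~~
The failure above is a plain FileNotFoundError: the recipe's manifest_path points at '/Users/arunkumar/Documents/Datatool/data_catalog/target/manifest.json', a macOS host path, while the os_details field shows the ingestion running inside a Linux (linuxkit) executor container, which will not see the host's /Users directory unless it is explicitly mounted. Below is a minimal dbt recipe sketch, assuming the dbt target/ artifacts are copied or mounted somewhere the executor can read; the /opt/dbt-artifacts location and the postgres target_platform are placeholders, not values taken from the failing run:

source:
  type: dbt
  config:
    # Paths must be readable from inside the executor container, not the host.
    manifest_path: "/opt/dbt-artifacts/manifest.json"
    catalog_path: "/opt/dbt-artifacts/catalog.json"
    target_platform: "postgres"   # placeholder: the warehouse the dbt models run against
sink:
  type: datahub-rest
  config:
    server: "http://datahub-gms:8080"

The dbt source can also load these artifacts from s3:// URIs, so publishing manifest.json and catalog.json to object storage is an alternative to mounting a host directory into the executor.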