When using Google Cloud Dataprep, I am able to create automated schedules to run jobs that update my BigQuery tables.
However, this automation seems pointless given that the source data Dataprep reads is only updated by manually dragging and dropping CSVs (or JSON, XLSX, etc.) into the Cloud Storage bucket.
I have searched for a definitive way to update this bucket automatically with files that are regularly updated on my PC, but I cannot find any best-practice solution.
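The closest thing I have come up with is scheduling a sync from my machine, roughly like the sketch below (bucket name and local path are placeholders, and this assumes the Google Cloud SDK is installed and authenticated locally), but I do not know whether this is the intended approach:

```shell
# Hypothetical sketch: mirror a local export folder into the Dataprep
# source bucket. -m parallelizes transfers; -r recurses into subfolders.
# "my-dataprep-bucket" and ~/data/exports are placeholder names.
gsutil -m rsync -r ~/data/exports gs://my-dataprep-bucket/uploads
```

Presumably this could be run on a schedule with cron (or Task Scheduler on Windows), so that the bucket stays in step with the files on my PC before the Dataprep job fires.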
How should one go about keeping the bucket in sync efficiently and reliably?