I load data from GBQ in a Jupyter Notebook with pd.read_gbq(). Then I preprocess it and build an ML model (I know about BigQuery ML, but that's not enough for my tasks). I know how to load the results into GBQ, but I don't know how to do it automatically. I need to train the model and predict every day. Are there any options with cloud services? Maybe using Colab?
You can use Google Cloud Composer, a managed workflow-orchestration service that lets you schedule and monitor pipelines. It is built on Apache Airflow, so you can write complex workflows in Python.
Composer uses DAGs (Directed Acyclic Graphs): every vertex of a DAG represents one task, and the edges represent the order in which the tasks run.
So you can schedule a DAG to load data from BigQuery, train your ML model, and then load the results back into BigQuery.
There are guides on how to write a DAG with Python, and inside a PythonOperator you can use whatever packages you need to interact with BigQuery or TensorFlow. A minimal sketch follows.
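For illustration, here is a minimal sketch of such a daily DAG, assuming Airflow 2.x (as used by current Composer environments) and the pandas-gbq package. The project, dataset, and table names (my_project, my_dataset, training_data, predictions) are placeholders, and train_and_predict / validate_output are hypothetical helpers standing in for your real preprocessing, training, and prediction code:

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def train_and_predict():
    # Pull training data from BigQuery (placeholder project/dataset/table).
    df = pd.read_gbq(
        "SELECT * FROM `my_project.my_dataset.training_data`",
        project_id="my_project",
    )
    # ... your preprocessing and model training go here ...
    predictions = df  # placeholder: replace with the real prediction DataFrame
    # Write the daily predictions back to BigQuery.
    predictions.to_gbq(
        "my_dataset.predictions",
        project_id="my_project",
        if_exists="replace",
    )


def validate_output():
    # Hypothetical sanity check: make sure today's predictions landed.
    df = pd.read_gbq(
        "SELECT COUNT(*) AS n FROM `my_project.my_dataset.predictions`",
        project_id="my_project",
    )
    assert df["n"].iloc[0] > 0


with DAG(
    dag_id="daily_ml_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",  # run once per day
    catchup=False,
) as dag:
    train = PythonOperator(
        task_id="train_and_predict",
        python_callable=train_and_predict,
    )
    validate = PythonOperator(
        task_id="validate_output",
        python_callable=validate_output,
    )
    # The edge: validate_output runs only after train_and_predict succeeds.
    train >> validate
```

To deploy this in Composer, you copy the .py file into the DAGs folder of your environment's Cloud Storage bucket; Airflow picks it up automatically and runs it on the daily schedule.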