To run a rather lengthy SQL query from a Logic App, I saved it as a stored procedure in my database, then created a job with a step that runs the stored procedure, along with an owner and a target group for the job. I can see my job, credentials, and target group on the Elastic Job Agent that I created for this.
When I run the query directly on the database, it runs successfully and takes about 28 minutes.
I then run the job from my Logic App. That is, the Logic App does not run the query directly; it starts the job, which runs the stored procedure as a step.
But when I check the result in the Elastic Job Agent, I see that it times out almost immediately, and the query does not seem to execute at all. I know this because the query begins by emptying a table, and that never happens.
As I mentioned, the Logic app is quite simple and it only runs this job.
The reason I am doing all this is the timeout problem Logic Apps have with lengthy SQL queries. But now I get a timeout from the Elastic Job Agent even faster than the Logic App's 2-minute limit.
UPDATE:
When I look at the steps I went through to create credentials, I realize that I might have overlooked this step: Credentials for running jobs.
Does this mean that I should create a separate database for my jobs? There is only one database we are dealing with, which is the same that contains my tables.
UPDATE 2:
The Elastic Job agent I have created has my database assigned as Job database. So everything is apparently on the same db.
It seems that you don't need to create a separate database for an Azure Elastic Job; maybe you missed something when setting it up. You can refer to my process of creating one using SQL:
use master;
-- Logins are created in master.
CREATE LOGIN masteruser WITH PASSWORD='qazwsx!1111';
CREATE LOGIN jobuser WITH PASSWORD='qazwsx!1111';
-- The refresh credential's login also needs a user in master.
CREATE USER masteruser FROM LOGIN masteruser;

use gongdb;
-- Create users in the target database from the logins above.
CREATE USER masteruser FROM LOGIN masteruser;
CREATE USER jobuser FROM LOGIN jobuser;
GRANT CREATE TABLE TO jobuser;
EXEC sp_addrolemember 'db_owner', 'jobuser';
use gongdb;
-- Create a db master key if one does not already exist, using your own password.
CREATE MASTER KEY ENCRYPTION BY PASSWORD='qazwsx!1111';
-- Create a database scoped credential.
CREATE DATABASE SCOPED CREDENTIAL jobuser
WITH IDENTITY = 'jobuser',
SECRET = 'qazwsx!1111';
CREATE DATABASE SCOPED CREDENTIAL masteruser
WITH IDENTITY = 'masteruser',
SECRET = 'qazwsx!1111';
SELECT * FROM sys.database_scoped_credentials
EXEC jobs.sp_add_target_group 'DemoGroup';
SELECT * FROM jobs.target_groups WHERE target_group_name='DemoGroup';
EXEC jobs.sp_add_target_group_member 'DemoGroup',
@target_type = 'SqlServer',
@refresh_credential_name='masteruser', --credential required to refresh the databases in server
@server_name='<server>.database.windows.net'
GO
EXEC jobs.sp_add_job @job_name='CreateTableTest', @description='Create Table Test'
EXEC jobs.sp_add_jobstep @job_name='CreateTableTest',
@command=N'IF NOT EXISTS (SELECT * FROM sys.tables WHERE object_id = object_id(''Test''))
CREATE TABLE [dbo].[Test]([TestId] [int] NOT NULL);',
@credential_name='jobuser',
@target_group_name='DemoGroup'
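If your job starts but a long-running stored procedure gets cut off, note that jobs.sp_add_jobstep also accepts a @step_timeout_seconds parameter (the default is 43,200 seconds, i.e. 12 hours). As a sketch, you could set it explicitly for a procedure that runs for around 30 minutes; the job name and procedure name below are hypothetical, while the credential and target group reuse the ones created above:

```sql
-- Sketch (job and procedure names are examples): a step that is
-- allowed to run for up to one hour before the agent cancels it.
EXEC jobs.sp_add_job @job_name = 'RunLongProc', @description = 'Long stored procedure';
EXEC jobs.sp_add_jobstep
    @job_name = 'RunLongProc',
    @command = N'EXEC dbo.MyLongProcedure;',   -- hypothetical procedure
    @credential_name = 'jobuser',
    @target_group_name = 'DemoGroup',
    @step_timeout_seconds = 3600;              -- default is 43,200 (12 hours)
```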
SELECT * FROM jobs.jobs
SELECT js.* FROM jobs.jobsteps js
JOIN jobs.jobs j
ON j.job_id = js.job_id AND j.job_version = js.job_version
exec jobs.sp_start_job 'CreateTableTest';
SELECT * FROM jobs.job_executions WHERE is_active = 1 ORDER BY start_time DESC
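When an execution fails, the lifecycle and last_message columns of jobs.job_executions usually tell you why (for example a credential error or a timeout). A sketch of drilling into the most recent executions of the job, including the per-step and per-target rows:

```sql
-- Inspect recent executions of the job; last_message holds the
-- error text for failed rows, lifecycle shows Succeeded/Failed etc.
SELECT job_name, step_id, target_database_name,
       lifecycle, last_message, start_time, end_time
FROM jobs.job_executions
WHERE job_name = 'CreateTableTest'
ORDER BY start_time DESC;
```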
UPDATE:
When you create an Azure Elastic Job agent, you specify only one database as the job database. The target database is the database you need to access: for example, if you perform table-creation operations in gongdb, your target database is gongdb.
For the Azure Elastic Job agent to access the target database, you need to create a USER in the target database, and then create the credentials required for access. The master database is needed because logins are defined at the server level in master; the USER in each target database must be created from a login that exists in master.
In the SQL I provided, both the job database and the target database are gongdb, which may cause some confusion.
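For completeness, here is a sketch of the same setup when the job database and the target database are different. All database names and the password placeholder are examples, not values from your environment:

```sql
-- Sketch: job database = jobdb, target database = appdb (names are examples).
use master;
CREATE LOGIN jobuser WITH PASSWORD='<strong password>';

use appdb;   -- the TARGET database: the job's T-SQL runs here
CREATE USER jobuser FROM LOGIN jobuser;
EXEC sp_addrolemember 'db_owner', 'jobuser';

use jobdb;   -- the JOB database: credentials and job metadata live here
CREATE MASTER KEY ENCRYPTION BY PASSWORD='<strong password>';
CREATE DATABASE SCOPED CREDENTIAL jobuser
    WITH IDENTITY = 'jobuser', SECRET = '<strong password>';
```

The credential in the job database carries the login's name and password, which is how the agent authenticates against the target database at run time.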