This might be a very silly question, but I am having a hard time figuring out a way to copy the data in my Big SQL table to a .txt file on the local file system.
Depending on the size of the resulting data, you can use the EXPORT command to get the data into a single text file. The resulting file will end up on one node.
I used the following script (t1.sql) as an example:
\connect bigsql
drop table if exists stack.issue2;
create hadoop table if not exists stack.issue2 (
f1 integer,
f2 integer,
f3 varchar(200),
f4 integer
)
stored as parquetfile;
insert into stack.issue2 (f1,f2,f3,f4) values (0,0,'Detroit',0);
insert into stack.issue2 (f1,f2,f3,f4) values (1,1,'Mt. Pleasant',1);
insert into stack.issue2 (f1,f2,f3,f4) values (2,2,'Marysville',2);
insert into stack.issue2 (f1,f2,f3,f4) values (3,3,'St. Clair',3);
insert into stack.issue2 (f1,f2,f3,f4) values (4,4,'Port Huron',4);
select * from stack.issue2;
-- export the table server-side as a comma-delimited (DEL) file
{ call sysproc.admin_cmd('export to /tmp/t1.unl of del select * from stack.issue2') };
\quit
Running the script:
jsqsh --autoconnect --input-file=./t1.sql --output-file=t1.out
Yields:
cat t1.out
+----+----+--------------+----+
| F1 | F2 | F3 | F4 |
+----+----+--------------+----+
| 0 | 0 | Detroit | 0 |
| 2 | 2 | Marysville | 2 |
| 3 | 3 | St. Clair | 3 |
| 1 | 1 | Mt. Pleasant | 1 |
| 4 | 4 | Port Huron | 4 |
+----+----+--------------+----+
+---------------+---------------+-------------+
| ROWS_EXPORTED | MSG_RETRIEVAL | MSG_REMOVAL |
+---------------+---------------+-------------+
| 5 | [NULL] | [NULL] |
+---------------+---------------+-------------+
and the exported file:
ls -la /tmp/t1.unl
-rw-r--r-- 1 bigsql hadoop 93 Mar 3 16:05 /tmp/t1.unl
cat /tmp/t1.unl
0,0,"Detroit",0
3,3,"St. Clair",3
2,2,"Marysville",2
1,1,"Mt. Pleasant",1
4,4,"Port Huron",4
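Since the DEL format is just comma-separated text with character columns in double quotes, the exported file can be consumed with standard Unix tools. A minimal sketch (it recreates the /tmp/t1.unl contents shown above rather than running a new export, so the data and path come straight from the example):

```shell
# Recreate the DEL output from the example export above
cat > /tmp/t1.unl <<'EOF'
0,0,"Detroit",0
3,3,"St. Clair",3
2,2,"Marysville",2
1,1,"Mt. Pleasant",1
4,4,"Port Huron",4
EOF

# Row count should match ROWS_EXPORTED
wc -l < /tmp/t1.unl
# → 5

# Extract the f3 column, stripping the DEL double quotes
cut -d, -f3 /tmp/t1.unl | tr -d '"'
```

Note that cut splits on every comma, so this simple approach only works while no character value itself contains a comma; a CSV-aware tool is safer for arbitrary data.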