Import Fixed-Width file into a Database

Looking for an example project to get you started?

Support_Julie

Support Specialist
Posts: 91
Joined: Thu Mar 05, 2009 3:49 pm
Location: Ashland, NE USA

Post by Support_Julie » Tue Jun 16, 2009 10:29 am
Fixed-Width file to Database

Fixed-width file being imported:
Code:
1111111111aaaaaaaaaaaaaaaaaaaaaaaaa 11111111.11
2222222222bbbbbbbbbbbbbbbbbbbbbbbbb-22222222.22
3333333333ccccccccccccccccccccccccc 33333333.33
Field layout of database table receiving data:
Code:
PARTNBR             Character                   10      
DESCRIPTION         Character                   25       
PRICE               Packed Dec                  10,2 

This project gets a file from an FTP server, reads the fixed-width file, and then inserts the data into a database table. Please note that the SQL INSERT statement needs one question mark (parameter marker) for each field you are inserting into. In this example three fields are being inserted, so the insert statement is: INSERT INTO DEMO.TEST3 (PARTNBR, DESCRIPTION, PRICE) VALUES(?, ?, ?)
When inserting into 2 fields you would have two question marks in the insert statement, e.g.: INSERT INTO LIB.FILE VALUES(?,?)
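As a quick illustration of how a two-column insert would sit inside the query task, here is a minimal sketch (LIB.FILE, the column names, and the myRowSet variable are placeholders, not part of the project below):
Code:
<!-- Placeholder names: LIB.FILE, FIELD1/FIELD2, and myRowSet are illustrative only -->
<query label="Import two columns" inputRowSetVariable="${myRowSet}">
	<statement>INSERT INTO LIB.FILE (FIELD1, FIELD2) VALUES(?, ?)</statement>
</query>
The complete example project follows: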
Code:
<project name="Fixed-width to database" mainModule="Main" version="1.0" logLevel="debug">

	<module name="Main">
		<createWorkspace />

		<ftp label="Connect to FTP server" resourceId="Dev 54">
			<get label="Get Fixed-width File" sourceFile="/Demo/TEST1.TXT" destinationDir="${system.job.workspace}" type="binary" destinationFilesVariable="fixedwidthFile" />
		</ftp>

		<readFixedWidth label="Read Fixedwidth File" inputFile="${fixedwidthFile}" outputRowSetVariable="fixedwidthData" skipFirstRow="false" recordDelimiter="CRLF">
			<data>
				<column index="1" name="PARTNBR" type="CHAR" size="10" alignment="left" />
				<column index="2" name="DESCRIPTION" type="CHAR" size="25" alignment="left" />
				<column index="3" name="PRICE" size="12" alignment="right" />
			</data>
		</readFixedWidth>


		<sql label="Connect to DB" resourceId="DEV54">
			<query label="clear out file">
				<statement>DELETE 
FROM 
	DEMO.TEST3
</statement>
			</query>
			<query label="Import Data" inputRowSetVariable="${fixedwidthData}">
				<statement>INSERT INTO DEMO.TEST3
	(PARTNBR, DESCRIPTION, PRICE)
VALUES
	(?, ?, ?)</statement>
			</query>
		</sql>


		<deleteWorkspace />

	</module>

	<description>FTP Get Fixed-width file and insert into Database</description>
</project>
* When doing an FTP GET of a fixed-width file, change the transfer type to BINARY to preserve the end-of-record marker (CRLF).
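For reference, that is the type="binary" setting on the get element in the project above:
Code:
<get label="Get Fixed-width File" sourceFile="/Demo/TEST1.TXT" destinationDir="${system.job.workspace}" type="binary" destinationFilesVariable="fixedwidthFile" />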
Julie Rosenbaum
Sr Support Analyst
e. [email protected]
p. 1.800.949.4696
w. HelpSystems.com

cmayil

Posts: 7
Joined: Thu Sep 08, 2016 8:43 am

Post by cmayil » Fri Mar 03, 2017 7:41 am
Hello Team,
Could you please let me know how to do a bulk upload from a CSV/text file to a database?
I have text files with 2,274,700 records. Executing the SQL insert query as described in this topic is taking longer than expected.
It has been running for around 2 hours.

Please help me with this.
Thanks

Support_Rick

Support Specialist
Posts: 590
Joined: Tue Jul 17, 2012 2:12 pm
Location: Phoenix, AZ

Post by Support_Rick » Fri Mar 03, 2017 4:34 pm
Please look at the Advanced tab under the SQL/Query task... it will give you the "batch size" option as below:
[Attachment: Batch Size.png — screenshot of the Batch Size option on the Advanced tab]
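If you maintain the project as XML rather than through the GUI, the same setting would appear on the query task. A minimal sketch, assuming the Advanced-tab option maps to a batchSize attribute (the attribute name here is an assumption, not confirmed syntax):
Code:
<sql label="Connect to DB" resourceId="DEV54">
	<!-- batchSize is assumed to be the XML equivalent of the "Batch Size" option on the Advanced tab -->
	<query label="Import Data" inputRowSetVariable="${fixedwidthData}" batchSize="1024">
		<statement>INSERT INTO DEMO.TEST3 (PARTNBR, DESCRIPTION, PRICE) VALUES(?, ?, ?)</statement>
	</query>
</sql>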
Rick Elliott
Lead Solutions Consultant
(402) 944.4242
(800) 949-4696

cmayil

Posts: 7
Joined: Thu Sep 08, 2016 8:43 am

Post by cmayil » Tue Mar 21, 2017 3:04 am
Hi Rick,
Thank you for your reply.
It doesn't drastically improve performance.
I set the batch size to 1024; I was afraid my server would hang if I set it any higher.
Without a batch size set, the job took 79 minutes to complete. After setting the batch size to 1024, it took 63 minutes.
Please refer to the attached execution summary. Is there any other way to improve execution performance?

Thanks
Attachments
[Attachment: BulkUpload by setting Batch Size.JPG — result summary]

Support_Rick

Support Specialist
Posts: 590
Joined: Tue Jul 17, 2012 2:12 pm
Location: Phoenix, AZ

Post by Support_Rick » Tue Mar 21, 2017 8:50 am
You could increment up from 1024... even try 2048 and see what happens.

If you're afraid your server will hang, then you might have to look into why you feel that way about your server. Remember, this is a JDBC connection to your server; making sure those connections have plenty of resources will make this run more smoothly. We're just giving you an option to increase the slice of rows it sends with each transfer. Instead of 1 row per round trip, sending 2048 per round trip could significantly reduce your overall run time.
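Continuing the earlier sketch, trying 2048 would just mean changing the assumed batchSize attribute (again, the attribute name is illustrative, not confirmed syntax):
Code:
<!-- batchSize remains an assumed attribute name for the Advanced-tab "Batch Size" option -->
<query label="Import Data" inputRowSetVariable="${fixedwidthData}" batchSize="2048">
	<statement>INSERT INTO DEMO.TEST3 (PARTNBR, DESCRIPTION, PRICE) VALUES(?, ?, ?)</statement>
</query>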
Rick Elliott
Lead Solutions Consultant
(402) 944.4242
(800) 949-4696