Thousands of folders and files filled with client data sit on the Linux
samba server. Searching for data is difficult due to ever-changing
directory structures and naming schemes.
Example (exaggerating to make a point):
It goes on like that for about 650,000 files.
I used the following command to gather all file and directory names.
"/abc" is the main directory on the Linux samba server where all
company data is located:
root@acme:/# ls -Ralh /abc > /home/mike/file_output.txt
file_output.txt is 33 MB and lists approximately 650,000 file names
along with their directory paths.
I want to pull every file name and its directory path out of
"file_output.txt" and insert each file name into a PostgreSQL database
along with its location.
The ultimate goal is to be able to run an alphanumeric query and be
presented with the matching file names and their locations on the samba
server.
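Once the names and paths are in a flat tab-separated file (one row per file), the PostgreSQL side could be sketched roughly like this. The table name `file_index`, the column names, and the file path `/tmp/files.tsv` are my own placeholders; `\copy` is run from inside psql and reads the file client-side, so it needs no superuser rights:

```sql
-- Hypothetical schema; adjust names and types to taste.
CREATE TABLE file_index (
    id       serial PRIMARY KEY,
    path     text NOT NULL,   -- directory portion, e.g. /abc/clients
    filename text NOT NULL
);

-- From inside psql, bulk-load a tab-separated file of path<TAB>filename:
--   \copy file_index (path, filename) FROM '/tmp/files.tsv'

-- The alphanumeric query: case-insensitive substring match on the name.
SELECT path, filename
FROM file_index
WHERE filename ILIKE '%invoice%';
```

Note that a leading-wildcard `ILIKE '%...%'` cannot use a plain b-tree index and will scan the table, but at roughly 650,000 rows that is usually tolerable.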
How do I pull the data from "file_output.txt" and place it into the
database?
Any guidance and pointed RTFM declarations greatly appreciated.