If you specify the input file as '-', the script reads from stdin, so you can write a script that queries your Oracle database, prints the data in a CSV-like format, and pipes it into Nanocubes.
youroracleprogram | python csv2Nanocubes.py --latcol=lat --loncol=lon --catcol=category - | ncserve ...... (the rest of the command)
As long as the pipe from your program stays open, new data can be streamed into the Nanocubes server.
You can also supply your own Nanocubes file header with --ncheader=<hdrfile> and let the Python script just parse the CSV files and transform the data into the Nanocubes binary format.
At this point Nanocubes does not support random deletion of data, but we plan to release a new version that supports sliding temporal windows for streaming Nanocubes.
Re: [Nanocubes-discuss] Question about data feeding and updates
Hi Horace, thanks a lot for your response.
I tried what you suggested. Instead of having a script read data from Oracle, I used two CSV files as a proof of concept for appending. The script cats file1.csv, waits one minute, then cats file2.csv. I piped this script into python csv2Nanocubes.py with the '-' option, and that in turn into the NC server. But the data does not get appended to the Nanocubes server properly: the script completes all of its steps, including the wait, and only then sends all the data from file1 and file2 as one set to csv2Nanocubes.py. So it looks like batching every hour or so doesn't really work.
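For reference, here is a minimal sketch of the feeder I was attempting, rewritten in Python with an explicit flush after each batch, in case stdio buffering is what merges the two files into one set (the file names and the one-minute wait are just for this test, not anything prescribed by csv2Nanocubes.py):

```python
import shutil
import sys
import time

def send_batch(path, out=sys.stdout):
    # Copy one CSV's rows downstream, then flush so the batch is not
    # held in a stdio buffer until the whole script finishes.
    with open(path) as f:
        shutil.copyfileobj(f, out)
    out.flush()

def run_poc(wait_seconds=60):
    # The append test: first batch, wait, second batch.
    send_batch("file1.csv")
    time.sleep(wait_seconds)
    send_batch("file2.csv")
```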
I also tried it without my script, passing both CSV files as input arguments to csv2Nanocubes.py. That worked well, but it processes both files as one set and streams everything to the NC server at once. I then tried inserting a wait before processing the second file inside csv2Nanocubes.py, but then the data did not load into NC properly.
The append scenario I am trying to achieve is as follows.
1. At 9 ET I have one set of records (say 1000) that I want to send to the NC server. I am able to do that without any issues.
2. At 10 ET I get 100 more records that I need to append to the NC server, without stopping the NC server or the web server that are already running, since users are using the GUI.
3. At 11 ET a few more records to append, as in #2, and so on, since I will be getting more data every hour or so.
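The hourly loop above could be sketched like this (a minimal sketch; the file-name pattern, polling interval, and helper names are my own assumptions, not part of csv2Nanocubes.py). The idea is that the feeder process never exits, so the pipe into csv2Nanocubes.py stays open, and each new file is flushed downstream as soon as it appears:

```python
import glob
import sys
import time

def stream_new_rows(pattern, seen, out=sys.stdout):
    """Emit rows from any CSV matching `pattern` not sent before.

    `seen` (a set of paths, updated in place) tracks what has already
    been streamed, so each hourly file is appended exactly once.
    Returns the number of rows written.
    """
    count = 0
    for path in sorted(glob.glob(pattern)):
        if path in seen:
            continue
        with open(path) as f:
            for line in f:
                out.write(line)
                count += 1
        out.flush()  # hand each batch downstream immediately
        seen.add(path)
    return count

def feed_hourly(pattern="batch_*.csv", interval=3600):
    # Poll for new files forever; the process (and thus the pipe
    # into csv2Nanocubes.py) never exits between hourly batches.
    seen = set()
    while True:
        stream_new_rows(pattern, seen)
        time.sleep(interval)
```

This would be run as `python feeder.py | python csv2Nanocubes.py ... - | ncserve ...`, with feeder.py a hypothetical name for the script above.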