Please back these details up with input samples and how and where to get those! How to identify the data you need. You don't expect people in here to crawl through all those sites, do you?
Hey there RudiC!
Sorry for not answering earlier and, as you'll see, I deleted all of the "https" prefixes from my reply, as the forum doesn't let me post URLs until I have at least 5 posts.
You're right, I don't expect people to crawl through the site. I'm not sure I fully understand what you mean, though.
To get the data, you need to generate a listing through this link:
The URL is pretty easy to adapt, and I have already adapted it to my current needs. This will get a densitometer equipment listing, and afterwards I can easily adapt the URL myself to get the other equipment types (the structure is the same across all of them).
A few comments on the link itself though:
It obviously limits the output to 20 listings. I am using 20 right now so that the requests are fast and easy, but I will change it to 200 afterwards to get much more information and many more listings.
I'm mostly interested in listings where the price is mentioned, so I decided to sort by descending price so that I get the listings with prices first (they are more relevant to me).
I chose Spain as a filter, but the particular country isn't very relevant; I'd simply rather have EU listings first, which is why I picked Spain.
Now back to the command:
With "curl" I'm fetching the listing (I could save that listing locally into an HTML file, but since that's not the objective, I pipe it straight into the grep command).
The grep then lists the links available for the listing I specified and that's it for now.
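Since the forum stripped the URL, here is the general shape of that extraction step run against a small inline HTML sample. The markup and file names below are assumptions for illustration; with the real site you would feed `curl -s "$URL"` into the same grep instead of reading a local file:

```shell
# Hypothetical sample of what the listing HTML might look like
cat > sample.html <<'EOF'
<a href="/listing/densitometer-123.html">Densitometer A</a>
<a href="/listing/densitometer-456.html">Densitometer B</a>
EOF

# Pull out just the link targets, one per line
grep -o 'href="[^"]*"' sample.html | sed 's/^href="//; s/"$//'
```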
The expected part:
The rest of the info I mentioned earlier is located in each individual URL.
What I need to do now, based on the previous "grep", is the following:
for each link (for instance, starting with the first one in my list), go and get:
- the price
- the condition
- the date_updated
Obviously, my objective is to generate a loop that will get me this info for each link in the listing, see how I can clean up the info, and send it to a CSV or similar file to store the information.
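A minimal sketch of that loop, using two local sample pages standing in for the real listing URLs (the `<span class="…">` field labels are pure assumptions about the page markup; with real pages you would replace the file read with `curl -s "$url"`):

```shell
# Create two local sample pages standing in for real listing pages
# (the field markup below is a guess, not the site's actual HTML)
mkdir -p pages
cat > pages/item1.html <<'EOF'
<span class="price">1200 EUR</span>
<span class="condition">Used</span>
<span class="date_updated">2023-05-01</span>
EOF
cat > pages/item2.html <<'EOF'
<span class="price">900 EUR</span>
<span class="condition">New</span>
<span class="date_updated">2023-04-15</span>
EOF

# CSV header first
echo "url,price,condition,date_updated" > listings.csv

# For each link (here: local file), pull the three fields and append a row
for f in pages/*.html; do
    price=$(grep -o '<span class="price">[^<]*' "$f" | sed 's/.*>//')
    cond=$(grep -o '<span class="condition">[^<]*' "$f" | sed 's/.*>//')
    upd=$(grep -o '<span class="date_updated">[^<]*' "$f" | sed 's/.*>//')
    echo "$f,$price,$cond,$upd" >> listings.csv
done

cat listings.csv
```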
I hope that some of this long post contains the info you were looking for. If not, I apologize; if you could detail your question a little more, that'd be great!
Hi all,
Still a newbie and learning as I go ... as you do :)
Have created this script to report on disc usage and I've just included the ChkSpace function this morning.
It's the first time I've read a file (line-by-bloody-line) and I would like to know if I can improve this script?
FYI - I... (11 Replies)
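For the line-by-line part, the usual safe idiom is `while read -r` with input redirected into the loop. A minimal sketch against a made-up two-column sample (the file name, fields, and 90% threshold are illustrative, not from the original script):

```shell
# Hypothetical sample standing in for real disc-usage data: filesystem, percent used
printf '%s\n' '/dev/sda1 80' '/dev/sdb1 95' > usage.txt

# Read line by line; -r stops backslashes being mangled
while read -r fs pct; do
    [ "$pct" -ge 90 ] && echo "WARNING: $fs at ${pct}%"
done < usage.txt
```

Redirecting the file into the loop avoids the `cat file | while ...` subshell, so variables set inside the loop survive after it.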
Hi ,
I'm searching for files across many AIX servers with rsh, using this command:
find /dir1 -name '*.' -exec ls {} \;
and then count them with "wc"
but I would like to improve this search because it takes too long, and replace the find directly with an ls command, but "ls *." doesn't work.
and... (3 Replies)
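One likely speed-up, sketched here against a small test tree: drop the `-exec ls {} \;` (which spawns one process per file) and pipe find's own output straight into `wc -l`. Note `ls *.` fails because the shell glob only matches the current directory and errors out when nothing matches, whereas find recurses:

```shell
# Build a small test tree with three names ending in a dot
mkdir -p dir1/sub
touch 'dir1/a.' 'dir1/b.' 'dir1/sub/c.' dir1/note.txt

# find prints matching paths itself; count them in one pipeline,
# with no extra ls process per file
find dir1 -name '*.' | wc -l
```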
I wrote this script to find the date x days before or after today. Is there any way this script can be sped up or otherwise improved?
#!/usr/bin/sh
check_done() {
if
then
daysofmth=31
elif
then
if
... (11 Replies)
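If GNU date is available (an assumption: the `#!/usr/bin/sh` shebang suggests an older Unix, where it may not be), the whole month-length bookkeeping above collapses into relative-date strings:

```shell
# GNU date does the calendar arithmetic itself; no manual
# days-per-month logic needed
days=10
date -d "$days days ago" +%Y-%m-%d   # x days before today
date -d "$days days" +%Y-%m-%d       # x days after today
```

On systems without GNU date, a hand-rolled script like the one above (or perl/ksh93 arithmetic) is still the way to go.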
Hi, can someone tell me which ways I can improve disk I/O and system process performance? Kindly refer me to some commands so I can try them on my test machine. Thanks, Mazhar (2 Replies)
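The usual starting points are the sysstat observability tools (`iostat -x`, `vmstat`, `sar`) to see where the bottleneck is before tuning anything. For a rough sequential-write baseline, a hedged dd sketch (file name and sizes are arbitrary; the throughput figure prints on stderr):

```shell
# Rough sequential-write throughput check; conv=fsync forces the data
# to disk before dd reports its rate (GNU dd)
dd if=/dev/zero of=ddtest.bin bs=1M count=16 conv=fsync
rm -f ddtest.bin
```

This is only a crude probe, not a benchmark; for sustained measurement prefer iostat while a real workload runs.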
I have a data file of 2 GB.
I need to do all of this, but it's taking hours. Is there anywhere I can improve performance? Thanks a lot.
#!/usr/bin/ksh
echo TIMESTAMP="$(date +'_%y-%m-%d.%H-%M-%S')"
function showHelp {
cat << EOF >&2
syntax extreme.sh FILENAME
Specify filename to parse
EOF... (3 Replies)
I have a 10 Gbps network link connecting two machines, A and B. I want to transfer 20 GB of data from A to B using TCP. With the default settings, I can only use 50% of the bandwidth. How do I improve the throughput? Is there any way to make the throughput as close to 10 Gbps as possible? Thanks~ :) (3 Replies)
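The usual first step (assuming Linux, which the snippet doesn't actually say) is raising the TCP socket buffer limits so a single connection's window can cover the link's bandwidth-delay product; the other common trick is running several parallel streams. A config-fragment sketch with illustrative values (~64 MB ceilings), to be run as root or placed in /etc/sysctl.conf:

```shell
# Illustrative Linux sysctls; tune the max values to your
# bandwidth-delay product (requires root)
sysctl -w net.core.rmem_max=67108864
sysctl -w net.core.wmem_max=67108864
sysctl -w net.ipv4.tcp_rmem="4096 87380 67108864"
sysctl -w net.ipv4.tcp_wmem="4096 65536 67108864"
```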
Hi All,
I have written a script, shown below, which is taking a lot of time: it searches for only 3,500 records, taken as input from one file, in a log file of approximately 12 GB.
The script reads the csv file as input, which has 2 fields, transaction_id and mobile_number, and searches... (6 Replies)
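The classic fix for this shape of problem (a sketch, since the original script isn't shown): instead of scanning the 12 GB log once per record, load all 3,500 keys into an awk hash and make a single pass over the log. Sample file names and contents below are invented stand-ins:

```shell
# Stand-ins: a small key file and "log" (real ones would be the
# 3,500-record csv and the 12 GB log)
cat > keys.txt <<'EOF'
TX1001
TX1003
EOF
cat > big.log <<'EOF'
TX1000,555000000,ok
TX1001,555000001,ok
TX1002,555000002,fail
TX1003,555000003,ok
EOF

# NR==FNR is true only while reading the first file: store its keys,
# then print any log line whose first field is a stored key.
# One pass over the big file instead of 3,500.
awk -F, 'NR==FNR {keys[$1]; next} $1 in keys' keys.txt big.log
```

`grep -F -f keys.txt big.log` is a similar one-pass alternative when whole-line substring matching is acceptable.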
I just wrote a very small script that improves the readability of the system sulog. The problem with any sulog is the lack of clarity about whether the info you are looking at is the most current. So if you just need a simple solution, instead of going through the trouble of writing a script that rotates logs and... (0 Replies)
Gents.
I have 2 different scripts for the same purpose:
raw2csv_1
Script raw2csv_1 finishes the process in less than 1 minute
raw2csv_2
Script raw2csv_2 finishes the process in more than 6 minutes.
Can you please check if there is any option to improve raw2csv_2 so it finishes the job... (4 Replies)
Gents,
Is there any possibility of improving this script while keeping the same output information?
I wrote this script, but I believe there is a much shorter way to get the same output.
Here is my script:
awk -F, '{if($10>0 && $10<=15) print $6}' tmp1 | sort -k1n | awk '{a++} END { for (n in a )... (23 Replies)
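The truncated pipeline above filters on field 10, prints field 6, then sorts and counts in a second awk. A likely shortening (a sketch only, since the tail of the original command and the real data are not shown): do the filter and the per-value count in one awk pass, indexing the count array by field 6. The sample tmp1 contents below are invented:

```shell
# Invented sample: field 6 is the value of interest, field 10 the filter key
cat > tmp1 <<'EOF'
a,b,c,d,e,S1,g,h,i,5
a,b,c,d,e,S1,g,h,i,12
a,b,c,d,e,S2,g,h,i,3
a,b,c,d,e,S2,g,h,i,20
EOF

# One pass: filter on field 10 and count occurrences of field 6,
# replacing the awk | sort | awk chain
awk -F, '$10>0 && $10<=15 {n[$6]++} END {for (v in n) print v, n[v]}' tmp1 | sort
```

The trailing sort only orders the final (small) summary, not the whole data file.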