Help needed using sed to replace a URL in thousands of web pages


 
# 1  
Old 08-15-2008

Hi,

I'm new to scripting. I understand the concepts and syntax of some commands, but I have difficulty with others and with combining actions to achieve what I'm trying to do, so I hope someone here can help.

A long while back I inherited a website with thousands of pages, most of which were created by a tool I'm unable to use to recreate them from scratch. A lot of these pages have a hard-coded link to the home page of the domain where the site was originally hosted.

Having changed the domain to something more suitable, I'm now stuck with thousands of pages, spread across perhaps 50 directories, that have incorrect links to the home page of a site that no longer exists.

I understand that sed can easily be used to replace all occurrences of one string with another
Code:
's/{old value}/{new value}/'

but I'm having difficulty extending this to act recursively on a whole site, descending into all the directories and replacing all the old URLs with new links.

Some pages are up to three directories deep from the home page, but even a clue as to how to run the command on all files within a single directory would be a start.
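For the single-directory case, one common pattern is a shell loop that writes sed's output to a temporary file and moves it back over the original (a minimal sketch; `old.example.com` and `new.example.com` are placeholder domains, not the real ones from the site):

```shell
#!/bin/sh
# Replace the old home-page URL in every .html file in the current
# directory. Using % as the sed delimiter means the slashes in the
# URLs do not need escaping. Substitute the real domains for the
# example.com placeholders.
for f in *.html; do
  sed 's%http://old.example.com%http://new.example.com%g' "$f" > "$f.tmp" \
    && mv "$f.tmp" "$f"
done
```

Because sed reads its input line by line and the output goes to a separate temporary file, this approach has no file-size limit like the 8k truncation mentioned above.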

A while back someone did something similar for me, but due to a memory limit in how it was run, any file over 8k in size lost everything past the 8k mark.

I have access to a Unix box where I can place the site, run the command/script, and then use the output to replace the existing site.

Any suggestion as to where to start would be much appreciated.

Cheers

Bob
# 2  
Old 08-15-2008
Code:
find /path/to/root/of/docs -type f -exec sed -i 's%http://%https://%g' {} \;

Something along those lines. sed -i might not be available to you, but this same question gets answered here almost daily, so searching along these lines will hopefully turn up something that works for you.
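A slightly more defensive variant of the same idea (a sketch, assuming your sed supports -i with a backup suffix, as GNU sed does, and using hypothetical `example.com` domains): restrict find to the page types you actually want to edit, and keep a .bak copy of each original so a bad substitution can be undone.

```shell
#!/bin/sh
# Recursively rewrite the old home-page URL, but only in .html/.htm
# files, keeping a .bak backup of every file sed touches.
find /path/to/root/of/docs \( -name '*.html' -o -name '*.htm' \) \
  -exec sed -i.bak 's%http://old.example.com%http://new.example.com%g' {} \;
```

Once you've spot-checked the results, the backups can be cleaned up with `find /path/to/root/of/docs -name '*.bak' -delete`.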
# 3  
Old 08-15-2008
Cheers

Many thanks for your time. I've had a tinker around, escaped the slashes in the URLs, and it works a treat.
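For anyone finding this thread later: escaping the slashes works, but sed also accepts almost any character as the delimiter after the s command, which avoids the escaping entirely. A small sketch with hypothetical example.com domains, showing the two equivalent forms:

```shell
#!/bin/sh
# These two commands do the same substitution; the second uses % as
# the delimiter so the slashes in the URLs need no backslashes.
echo 'see http://old.example.com/index.html' | \
  sed 's/http:\/\/old.example.com/http:\/\/new.example.com/'
echo 'see http://old.example.com/index.html' | \
  sed 's%http://old.example.com%http://new.example.com%'
```

Both print `see http://new.example.com/index.html`.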

Cheers again,

Bob