HTTP Protocol
I have worked with the HTTP protocol for about ten years now, and I have found it to be incredibly useful for transferring data, especially between remote systems and regardless of the network environment. Before web services existed, developers used HTTP to screen-scrape data off of web pages in order to interact with remote systems, and then processed the data as they needed. I used the HttpWebRequest and HttpWebResponse classes to screen-scrape data from various sites that had information I needed when no web service was available. This allowed me to call just about any web page and grab all of the content on it.
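To give a self-contained picture of that pattern before looking at the spider code below, here is a minimal sketch of a request/response round trip with those two classes. The URL and variable names are placeholders for illustration, not taken from the original spider.

// Minimal sketch of the screen-scraping pattern described above
using System;
using System.IO;
using System.Net;

class PageFetcher
{
    static void Main()
    {
        // Create the request for the page to scrape (placeholder URL)
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/");

        // Get the response and read the entire page body as a string
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            string content = reader.ReadToEnd();
            Console.WriteLine(content.Length + " characters retrieved");
        }
    }
}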
Below is a piece of a web spider that I built about 5-7 years ago. The spider uses the HTTP protocol to request web pages and then parses the data that is returned. At the time I wrote the spider, I wanted to create a searchable index of sites I frequented.
// C# 2.0 Framework
// Create a request for a specific web page
HttpWebRequest webreq = (HttpWebRequest)WebRequest.Create(_Url);

// Store the response of the web request
HttpWebResponse webresp = (HttpWebResponse)webreq.GetResponse();

// Read the full response body into the page content field
StreamReader loResponseStream = new StreamReader(webresp.GetResponseStream());
_Content = loResponseStream.ReadToEnd();

// Adjust the encoding of the response
string charset = "";
EncodeString(ref _Content, ref charset);
loResponseStream.Close();

// Parse data from the response
_Content = _Content.Replace("\n", "");
_Head = GetTagByName("Head", _Content);
_Title = GetTagByName("title", _Content);
_Body = GetTagByName("body", _Content);
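The EncodeString and GetTagByName calls are helpers from the spider class and are not shown here. Purely as an illustration of what the tag-extraction step could look like, here is a hypothetical regex-based version of a GetTagByName-style helper; it is an assumption about the approach, not the original code.

// Hypothetical GetTagByName-style helper, for illustration only
using System.Text.RegularExpressions;

static string GetTagByName(string tagName, string content)
{
    // Case-insensitive, non-greedy match for <tag ...> ... </tag>
    Match match = Regex.Match(
        content,
        "<" + tagName + "[^>]*>(.*?)</" + tagName + ">",
        RegexOptions.IgnoreCase | RegexOptions.Singleline);

    // Return the inner content if the tag was found, otherwise an empty string
    return match.Success ? match.Groups[1].Value : string.Empty;
}

A regex approach like this is fine for grabbing a single well-formed tag such as title, but it breaks down quickly on messy real-world HTML, which is why a dedicated HTML parser is usually the safer choice for anything beyond simple scraping.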