JSP program to capture web page code
Author: Eve Cole
Update Time: 2009-07-02 17:12:32
<%@ page contentType="text/html;charset=gb2312"%>
<%
// Open a connection to the target page.
java.net.URL l_url = new java.net.URL("http://www.163.net/");
java.net.HttpURLConnection l_connection = (java.net.HttpURLConnection) l_url.openConnection();
l_connection.connect();

// Read the response line by line; readLine() strips line terminators, so add them back.
// The reader uses the platform default charset, as in the original listing.
java.io.BufferedReader l_reader = new java.io.BufferedReader(
        new java.io.InputStreamReader(l_connection.getInputStream()));
StringBuilder sTotalString = new StringBuilder();
String sCurrentLine;
while ((sCurrentLine = l_reader.readLine()) != null) {
    sTotalString.append(sCurrentLine).append('\n');
}
l_reader.close();

// Write the captured page source into the response.
out.println(sTotalString.toString());
%>
Postscript
Although the code is fairly simple, it can serve as the basis for a "web crawler": find the href links in the fetched page, fetch each of those links in turn, and keep "grabbing" recursively (the number of levels can of course be limited). In this way a "web search" function can be built, as the sketch below illustrates.
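The following is a minimal standalone Java sketch of that idea; it is not part of the original article. It reuses the same HttpURLConnection capture logic, pulls href values out of the HTML with a simple regular expression, and follows each link recursively up to a fixed depth. The class name SimpleCrawler, the HREF pattern, the MAX_DEPTH limit, and the starting URL are all illustrative choices.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.HashSet;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal recursive "grabber": follow href links up to MAX_DEPTH levels.
public class SimpleCrawler {

    private static final int MAX_DEPTH = 2;  // limit on the number of layers
    private static final Pattern HREF =
            Pattern.compile("href\\s*=\\s*\"(http[^\"]+)\"", Pattern.CASE_INSENSITIVE);
    private final Set<String> visited = new HashSet<String>();  // avoid fetching a page twice

    public static void main(String[] args) throws Exception {
        new SimpleCrawler().crawl("http://www.163.net/", 0);
    }

    private void crawl(String pageUrl, int depth) {
        if (depth > MAX_DEPTH || !visited.add(pageUrl)) {
            return;
        }
        try {
            String html = fetch(pageUrl);
            System.out.println("Fetched " + pageUrl + " (" + html.length() + " chars)");

            // Find every absolute href in the page and follow it one level deeper.
            Matcher m = HREF.matcher(html);
            while (m.find()) {
                crawl(m.group(1), depth + 1);
            }
        } catch (Exception e) {
            System.out.println("Skipping " + pageUrl + ": " + e.getMessage());
        }
    }

    // Same line-by-line capture as the JSP above, wrapped in a helper method.
    private String fetch(String pageUrl) throws Exception {
        HttpURLConnection connection = (HttpURLConnection) new URL(pageUrl).openConnection();
        connection.connect();
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream()));
        StringBuilder total = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            total.append(line).append('\n');
        }
        reader.close();
        return total.toString();
    }
}

The visited set keeps the same URL from being fetched twice, and the depth check is the limit on the number of layers mentioned above; a real crawler would also need to resolve relative links, throttle its requests, and respect robots.txt.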