Introduction
A web crawler is an automated way to browse the internet and extract information from HTML pages or databases using the GET or POST method.
1. What is GET?
When you browse a web page, you are typically using the GET method to request the data.
First, open your browser and find the developer tools.
Then refresh the page and open the Network tab.
The "General" column shows the information including URL, Request Method, Status Code (200 is good) and other information. By using Get method, the server return final result to your computer.
2. What is POST?
Unlike GET, the POST method sends additional information to the server, and what is sent depends on the user's input.
Go to the request page and open the developer tools.
Then send the request.
You can see that the server calls a script, "summaries.pl", based on your request settings and returns the results used to build the table.
Your request settings are shown under Form Data.
Based on this idea, we can change the Form Data programmatically to fetch exactly what we want and save the useful information.
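As a sketch of that idea, the snippet below sends a POST request with form data and saves the response, again using Python's requests library. The endpoint URL and the form field names are hypothetical placeholders; copy the real endpoint and the keys shown in the Form Data section of your developer tools.

```python
# Minimal POST request sketch: send form data and save the returned HTML.
# The URL and form field names below are hypothetical placeholders.
import requests

url = "https://example.com/cgi-bin/summaries.pl"   # replace with the real script endpoint
form_data = {
    "keyword": "crawler",   # hypothetical form field, copy real keys from Form Data
    "page": "1",            # hypothetical form field
}

response = requests.post(url, data=form_data, timeout=10)

if response.status_code == 200:
    # Save the returned HTML so it can be parsed later.
    with open("result.html", "w", encoding="utf-8") as f:
        f.write(response.text)
```

Changing the values in `form_data` inside a loop is the basic pattern for crawling many result pages automatically.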