To start things off simple, we're just going to make a small utility that downloads a web page and saves it to disk. To do this, we will use the HttpClient class.
You will need to add the System.Net.Http assembly to your project's references.
HttpClient works by establishing a TCP connection to any server that serves web pages (or, more generally, responds to HTTP requests) and reading back the response. It's pretty simple to use, but you need to use asynchronous code. This is probably a slightly more complex example than HttpClient strictly requires, but it shows simple usage of the async and await keywords alongside it.
Our program first sets up a page to fetch, waits to hear back from the server (a 200 HTTP response), converts the data to a UTF-8 encoded byte array, and lastly saves it to the file name we specify at the top. In the
Main() body, as soon as we call
DownloadWebPage(), it starts to execute the code inside, but as soon as it hits an
await statement, control returns to the caller while the awaited operation continues in the background. The call to
Thread.Sleep() then pauses the main thread for at least five seconds, but this doesn't halt the downloading at all. So if you're on a good internet connection, the web page should finish downloading before the five seconds are over. If not, the call to
GetAwaiter().GetResult() at the end will block execution until the download completes.
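Pieced together from the description above, the program might look something like this sketch. The URL and output file name are placeholders, so substitute your own:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    // Placeholder values; change these to the page you want and where to save it.
    private const string Url = "http://example.com/";
    private const string FileName = "page.html";

    static async Task DownloadWebPage()
    {
        using (var client = new HttpClient())
        {
            // Asynchronously request the page; GetStringAsync throws
            // on a non-success status code.
            string content = await client.GetStringAsync(Url);

            // Convert the page text to a UTF-8 byte array and save it.
            byte[] data = Encoding.UTF8.GetBytes(content);
            File.WriteAllBytes(FileName, data);

            Console.WriteLine("Download finished.");
        }
    }

    static void Main()
    {
        Task dlTask = DownloadWebPage();

        // Control returns here at the first await inside DownloadWebPage();
        // the download keeps running while the main thread sleeps.
        Thread.Sleep(5000);

        // Block until the download completes, rethrowing any
        // original exception unwrapped.
        dlTask.GetAwaiter().GetResult();
    }
}
```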
You might notice that there is a lengthier call to
dlTask.GetAwaiter().GetResult() at the end of
Main(), where we could have just called
dlTask.Wait(). We do this because if an exception is thrown by the
Task, we get the original exception thrown to us instead of having it wrapped inside an
AggregateException, which is what a call to Wait() would produce.
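The difference is easy to see with a deliberately failing Task. This small standalone sketch (the exception and message are made up for illustration) shows each approach catching a different exception type:

```csharp
using System;
using System.Threading.Tasks;

class ExceptionDemo
{
    static void Main()
    {
        // A Task that always faults with an illustrative exception.
        Task fail = Task.Run(() => throw new InvalidOperationException("boom"));

        try
        {
            fail.Wait();
        }
        catch (AggregateException ex)
        {
            // Wait() wraps the original exception in an AggregateException.
            Console.WriteLine(ex.InnerException.GetType().Name);
        }

        try
        {
            fail.GetAwaiter().GetResult();
        }
        catch (InvalidOperationException ex)
        {
            // GetResult() rethrows the original exception directly.
            Console.WriteLine(ex.Message);
        }
    }
}
```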
Right now this program just downloads a single web page. A fun exercise for you would be to modify the code to download many different resources, from one website or several, at the same time.
I would like to thank AngularBeginner of Reddit for suggesting some code improvements.