Funnily enough, the idea for this project arose while I was browsing the Playboy site :-[ There were a lot of nice pictures there that I wanted to download, but some of them were really small. So I decided to write a program that would:
- download only certain kinds of content (e.g. only pictures)
- support a number of download filters (e.g. only pictures larger than 400*400)
- remember already-downloaded files, to save internet traffic
Frankly speaking, we didn't implement all the functionality I wanted. But at least we made a working console application in C++, with the help of libcurl and boost: it receives a URL, loads all the data from that URL, parses the links up to a configurable DEPTH, and downloads all the pages and all the data from those pages. Not bad for a student project, yeah? 8-).
You can download the result here or look at the source code here.
P.S. Thanks to my teammates Nazar Grabovskyy and Dmitriy Krasikov ;-)
P.P.S. It would be nice to finish this application someday :-)