It's an idea, an abstract concept.
It's not AI, not ML, not NNs, not LLMs, and not automation scripts like Puppeteer or Selenium.
End result: the computer program browses the internet.
It's a vague idea.
The question is not about the technicalities involved.
The intention of the question is to spark thought.
It's a single recursive function.
It's a scattered thought; it would be cool to watch such a program working, and cooler to build one.
https://stackoverflow.com/questions/79763073/how-to-make-computer-browse-internet-automatically
no
Why not?
A web crawler requires me to program the logic.
The concept does not require me to program the logic; instead, it starts by reading a seed page, stores the information/knowledge, breaks it down into doable actions, and performs the actions one by one.
This design allows the information to direct the program, whereas in the case of a web crawler, I would have to direct it.
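To make that loop concrete, here is a minimal sketch of what "read a seed page, store the knowledge, break it into actions, perform them one by one" could look like as a single recursive function. Everything in it is an assumption made for illustration: "knowledge" is just the set of pages already seen, and "doable actions" are just the links found on a page.

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def browse(url, knowledge, depth=0, max_depth=2):
    """Read a page, store it, break it into actions, perform them one by one."""
    if depth > max_depth or url in knowledge:
        return
    knowledge.add(url)                      # "stores information/knowledge"
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except Exception:
        return                              # unreachable pages are skipped
    parser = LinkParser()
    parser.feed(html)
    # "doable actions" here are just the absolute links found on the page
    actions = [urljoin(url, href) for href in parser.links]
    print("  " * depth + url)
    for action in actions:                  # "performs actions one by one"
        if action.startswith("http"):
            browse(action, knowledge, depth + 1, max_depth)

browse("https://example.com/", set())
```

Notice that, written this way, the sketch is exactly a depth-first crawler, which is the point of contention below.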
You want the system to have learned (discretion, taste...), but it cannot learn from a human programming the logic, so it must learn from either
1. Something like Machine Learning
or..
2. Some emergent property of mathematics / computation.
If you find 2... hell that would be something.
Maybe genetic algorithms?
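For what a genetic algorithm could even mean here, one reading is: evolve a small link-scoring heuristic instead of hand-coding it. The sketch below is a generic GA over weight vectors; the `score` features and the `fitness` signal (the hard part, e.g. some measure of how interesting the pages reached turn out to be) are assumed to exist and are purely hypothetical.

```python
import random

def score(weights, link_features):
    """Rank a link by a weighted sum of (assumed) numeric features."""
    return sum(w * f for w, f in zip(weights, link_features))

def evolve(fitness, n_weights=3, pop_size=20, generations=50):
    """fitness(weights) -> float, assumed given by some browsing trial."""
    pop = [[random.uniform(-1, 1) for _ in range(n_weights)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # keep the best half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # crossover
            i = random.randrange(n_weights)
            child[i] += random.gauss(0, 0.1)                     # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Toy usage with a stand-in fitness, just to show the loop runs:
best = evolve(lambda w: -abs(sum(w) - 1.0))
```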
A good web crawler can just go through all the links it finds; no programming is required.
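For comparison, the usual crawler shape is a frontier queue plus a visited set; the only "programming" is deciding what enters the frontier and in what order. A sketch, with `fetch_links(url)` an assumed helper that returns the URLs found on a page:

```python
from collections import deque

def crawl(seed, fetch_links, limit=100):
    """fetch_links(url) -> list of URLs on that page (assumed helper)."""
    frontier, visited = deque([seed]), set()
    while frontier and len(visited) < limit:
        url = frontier.popleft()            # FIFO: breadth-first order
        if url in visited:
            continue
        visited.add(url)
        for link in fetch_links(url):       # "go through all the links it finds"
            if link not in visited:
                frontier.append(link)
    return visited
```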
define it
StumbleUpon, but inside a for loop.
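Taken literally, that is a random walk over the link graph: pick a random link on the current page, hop, repeat. A sketch, reusing the assumed `fetch_links` helper from above:

```python
import random

def stumble(start, fetch_links, hops=10):
    """Hop to a random link on each page, StumbleUpon-style."""
    url = start
    for _ in range(hops):                   # "inside a for loop"
        print("visiting", url)
        links = fetch_links(url)
        if not links:
            return                          # dead end, nowhere to stumble
        url = random.choice(links)          # the page itself picks the next stop
```

This is arguably the closest match to "the information directs the program": no ranking, no goal, the current page alone decides where to go next.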
Just power it up; the vendor's spyware will do exactly this to serve the vendor's profits.
It would be cool to watch a computer browsing. Cooler to build one.