I am building a website with a focus on dynamic (user-generated) pages (articles, posts, etc.). I am wondering how to allow external search engines to crawl the website, including those dynamic pages.
For instance, if you type a technology-related question into Google at the moment, you're likely to get user-generated results from this website (and if you search for almost anything else, you will most likely get a Wikipedia article somewhere on the first page). How is this achieved? I assume there is a subtler method than simply adding each page to a sitemap. How do search engines find these pages without access to the database in which they are stored?
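
The only approach I can think of is generating the sitemap dynamically from the database, along the lines of the rough sketch below (Flask is just my assumption here, and `get_all_articles()` is a hypothetical stand-in for whatever query would pull the article slugs out of my tables). Is this really how large sites do it, or is there something better?

```python
# Rough sketch of a dynamically generated sitemap (Flask assumed;
# get_all_articles() is a hypothetical helper that would query my database).
from flask import Flask, Response, url_for

app = Flask(__name__)

def get_all_articles():
    # Hypothetical: would return slugs and last-modified dates from the articles table.
    return [
        {"slug": "example-article", "updated": "2023-01-01"},
    ]

@app.route("/articles/<slug>")
def show_article(slug):
    # Placeholder for the real article page.
    return f"Article: {slug}"

@app.route("/sitemap.xml")
def sitemap():
    # Build one <url> entry per article so crawlers can discover every dynamic page.
    entries = []
    for article in get_all_articles():
        loc = url_for("show_article", slug=article["slug"], _external=True)
        entries.append(
            f"<url><loc>{loc}</loc><lastmod>{article['updated']}</lastmod></url>"
        )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        + "".join(entries)
        + "</urlset>"
    )
    return Response(xml, mimetype="application/xml")
```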