I have a large amount of text stored in Elasticsearch. Using Lucene, NLP, and WordNet filters, the search quality is good, but not as good as Google's, because none of these methods use AI to understand questions or the meaning of the text. Even if Google is not relying on AI heavily yet, their search will keep improving over time while mine will not. I looked into technologies such as BabelNet, which provide meaningful word relations with supertypes and subtypes, but then I thought:
Would it be feasible to publish a static page with links to other pages (one per database entry), let Googlebot crawl and index them, and then, when a user searches through my web application, run a Google search restricted to my site and return Google's results?
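To make the idea concrete, here is a minimal sketch of the two halves I have in mind, assuming plain Python: rendering the static link page from database rows, and building a site-restricted request to the Google Custom Search JSON API. The API key, engine ID, domain, and the `render_link_page`/`build_search_url` helpers are all placeholders of my own, not anything that exists yet:

```python
import urllib.parse

# --- Half 1: emit a static HTML page with one link per database entry ---
def render_link_page(entries):
    """entries: list of (entry_id, title) tuples pulled from the database."""
    links = "\n".join(
        f'  <li><a href="/entry/{entry_id}">{title}</a></li>'
        for entry_id, title in entries
    )
    return f"<html><body><ul>\n{links}\n</ul></body></html>"

# --- Half 2: build a site-restricted Google Custom Search API request ---
API_KEY = "YOUR_API_KEY"      # placeholder: Google API key
ENGINE_ID = "YOUR_ENGINE_ID"  # placeholder: Programmable Search engine ID

def build_search_url(query, site="example.com"):
    params = urllib.parse.urlencode({
        "key": API_KEY,
        "cx": ENGINE_ID,
        "q": query,
        "siteSearch": site,       # restrict results to my domain
        "siteSearchFilter": "i",  # "i" = include only this site
    })
    return "https://www.googleapis.com/customsearch/v1?" + params
```

The web application would fetch the URL from `build_search_url(...)` and relay the returned result items to the user, so Google effectively becomes the search backend for my own content.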
What are the cons? (I think I can already see the pros.)