I have an Azure SQL database with 20+ million records that I need to manipulate (including joining other tables, etc.) before inserting them into Azure Search.
The procedure needs to be reproducible in case I need to rebuild the Azure Search index.
Some of the data in the search index is quite complex, e.g. a JSON model.
The routine needs to be as fast as possible.
Current Ideas
SQL - Write a SQL view and use an Azure Search data source/indexer to import the data. This worked in the past, but some of the data manipulation is too complex to recreate in a SQL view.
Azure Function - Write a function (in C#, which is what I'm comfortable working in) that reads from SQL, does the manipulation, and pushes the data into Azure Search. Would this scale? How long would it take to insert that many records?
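For context on the push approach: Azure Search accepts documents through a batch indexing API (up to 1,000 documents per batch request), so the function would read rows from SQL, shape them, and upload in batches. A minimal sketch using the `Azure.Search.Documents` SDK — the endpoint, key, index name, and the assumption that documents have already been shaped are all placeholders, not a definitive implementation:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Azure;
using Azure.Search.Documents;
using Azure.Search.Documents.Models;

class Indexer
{
    static async Task PushAsync(List<SearchDocument> docs)
    {
        // Placeholder service endpoint, index name, and admin key.
        var client = new SearchClient(
            new Uri("https://my-service.search.windows.net"),
            "my-index",
            new AzureKeyCredential("<admin-key>"));

        // The service limits each request to 1,000 documents (and ~16 MB),
        // so upload in fixed-size batches.
        const int batchSize = 1000;
        for (int i = 0; i < docs.Count; i += batchSize)
        {
            var batch = IndexDocumentsBatch.Upload(
                docs.Skip(i).Take(batchSize));
            await client.IndexDocumentsAsync(batch);
        }
    }
}
```

With 20M+ rows the throughput usually comes from running several of these batch uploads in parallel rather than from any single batch, so the real question is how far the SQL read and the search service tier let you parallelize.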
Any other ideas/processes/services that would get the data into the search index?