Notice, you need to fine-tune your own model, but Magic does this for you automatically. Simply point it at your existing website, and it will crawl the site, scrape it for data, and generate a custom AI model capable of answering intelligent questions about your content.
Edit - Fine-tuning is no longer required; instead, use the vectorise feature after creating your model by scraping your website or importing files/training snippets.
The above website has roughly 70 pages, and after scraping our site we get an accuracy of roughly 50% in answers. Importantly, though, the module logs all questions and answers, and lets you reinforce the model as you gather more training data, strengthening its results. My guess is that over a month or two the above bot will reach an accuracy of 95+ percent, simply from people asking it questions, me "correcting" its answers, and re-training it on those corrections a handful of times. You can see me demonstrating it in the video below.
Notice, since this article was published we've updated the process and significantly improved accuracy by relying upon "vectorising" instead of "fine-tuning" - See our other articles related to this for more details.
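To give a rough idea of why vectorising works without fine-tuning, here is a minimal sketch of the general technique (embedding-based retrieval), not Magic's actual implementation: each scraped snippet is turned into a vector once, and at question time the closest snippets are looked up and handed to the language model as context. The `embed` function below is a toy bag-of-words stand-in for a real embedding model, and all snippet text is invented for illustration.

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a real embedding model:
    # a bag-of-words count vector keyed by lower-cased word.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Vectorising": embed each scraped snippet once and store it.
snippets = [
    "Magic can scrape your website and turn pages into training snippets.",
    "Pricing starts with a free tier; paid plans add more requests.",
    "You can correct wrong answers to strengthen future results.",
]
index = [(s, embed(s)) for s in snippets]

def answer_context(question, top_n=1):
    # At question time, embed the question and return the closest
    # snippet(s) to pass to the LLM as context -- no fine-tuning needed.
    q = embed(question)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [s for s, _ in ranked[:top_n]]

print(answer_context("How do I correct a wrong answer?"))
```

Because the model itself never changes, adding or correcting training snippets only updates the index, which is why reinforcing the bot with corrections is cheap compared to repeated fine-tuning runs.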