Note from the author: In this article, we will learn how to create our own Question Answering (QA) API using Python, Flask, and the Haystack framework, packaged with Docker. Haystack provides the complete set of QA features and is highly scalable and customizable. The Medium Rules text will be used as the target document, and we will also fine-tune the model on it.
Basic Knowledge Required: Elasticsearch & Docker
This article contains working code that can be built and run directly with Docker.
Haystack runs queries against the documents held in a DocumentStore. Several DocumentStores are included in Haystack, such as ElasticsearchDocumentStore, SQLDocumentStore, and InMemoryDocumentStore. In this article, I am going to use Elasticsearch, which is the recommended option: it comes with features like full-text queries, BM25 retrieval, and vector storage for text embeddings.
Run the command below to install and start Elasticsearch.
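One standard option is the single-node Docker setup used in the Haystack tutorials; the exact image tag is an assumption and may differ from the one the author used:

docker run -d -p 9200:9200 -e "discovery.type=single-node" elasticsearch:7.9.2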
Verify the installation by browsing to http://localhost:9200. If the installation was successful, Elasticsearch responds with its cluster information as shown below:
The Haystack framework has three main components, the DocumentStore, the Retriever, and the Reader, each of which we select based on our requirements.
DocumentStore: As recommended, ElasticsearchDocumentStore will be used in this article. It comes with features like full-text queries, BM25 retrieval, and vector storage for text embeddings. Documents should be chunked into smaller units (e.g. paragraphs) before indexing to make the results returned by the Retriever more granular and accurate.
Retriever: Answers need to be found based on the similarity between the query and the embedded passages, so DensePassageRetriever, a powerful dense retriever that scores the similarity of texts, will be used.
Reader: The FARM reader is going to be used in this article; if you wish, the Transformers reader can be used instead. For good accuracy, the reader will load the "distilbert-base-uncased-distilled-squad" model.
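A minimal sketch of wiring these three components together, assuming the farm-haystack 0.x API (import paths and DensePassageRetriever arguments changed between releases, so treat the exact names as assumptions):

from haystack.document_store.elasticsearch import ElasticsearchDocumentStore
from haystack.retriever.dense import DensePassageRetriever
from haystack.reader.farm import FARMReader

# DocumentStore backed by the local Elasticsearch instance started earlier
document_store = ElasticsearchDocumentStore(host="localhost", username="", password="", index="document")

# Dense retriever that scores query/passage similarity with embeddings
retriever = DensePassageRetriever(document_store=document_store)

# FARM reader loading the pre-trained SQuAD model used throughout this article
reader = FARMReader(model_name_or_path="distilbert-base-uncased-distilled-squad", use_gpu=False)

Note that whenever new documents are written to the store, document_store.update_embeddings(retriever) has to be called so that the dense retriever has embeddings to search against.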
The app structure is shown below:
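Based on the files referenced in the rest of the article (main.py, util-trainer.py, requirements.txt, and the Dockerfile implied by the build commands), the layout is roughly the following; the folder name is illustrative:

qna_app/
├── main.py            # Flask API with the upload and question-answering endpoints
├── util-trainer.py    # fine-tuning utility used later in the article
├── requirements.txt
└── Dockerfile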
In this article, we are going to use the Elasticsearch document store. The application configuration is declared as below:
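A sketch of what that configuration might look like; the constant names here are illustrative, not the author's:

from flask import Flask

app = Flask(__name__)

# Elasticsearch connection used by the DocumentStore
ES_HOST = "localhost"
ES_PORT = 9200
DEFAULT_INDEX = "document"

# Pre-trained SQuAD checkpoint used by the reader
PRETRAINED_MODEL = "distilbert-base-uncased-distilled-squad"
# Directory where the fine-tuned model will be saved later in the article
TRAINED_MODEL_DIR = "train_model"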
Let's implement an endpoint for uploading a PDF document. The PDF will be indexed into Elasticsearch under the provided index name.
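A minimal sketch of such an endpoint, assuming the farm-haystack 0.x API; the route name and form field names are illustrative, and PDFToTextConverter additionally needs the pdftotext system utility installed in the image:

import os
from flask import Flask, request, jsonify
from haystack.document_store.elasticsearch import ElasticsearchDocumentStore
from haystack.file_converter.pdf import PDFToTextConverter

app = Flask(__name__)

@app.route("/upload", methods=["POST"])
def upload_pdf():
    # Index name supplied by the caller; the document lands in this Elasticsearch index
    index = request.form.get("index", "document")
    pdf = request.files["file"]
    file_path = os.path.join("/tmp", pdf.filename)
    pdf.save(file_path)

    # Extract the text from the PDF; in practice you would also split it into
    # paragraph-sized chunks (e.g. with Haystack's PreProcessor), as noted above
    converter = PDFToTextConverter(remove_numeric_tables=True, valid_languages=["en"])
    doc = converter.convert(file_path=file_path, meta={"name": pdf.filename})

    # Write the document into Elasticsearch; remember to refresh the retriever
    # embeddings afterwards when using DensePassageRetriever
    document_store = ElasticsearchDocumentStore(host="localhost", index=index)
    document_store.write_documents([doc])
    return jsonify({"status": "uploaded", "index": index})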
Let's implement an endpoint for querying that responds with the top n relevant answers from the Elasticsearch documents. The index name needs to be provided with the search query.
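A sketch of the query endpoint using the Finder abstraction from the 0.x Haystack releases (replaced by pipelines in later versions); the route name, payload fields, and top-k values are illustrative, and in a real app the retriever and reader would be created once at startup rather than per request:

from flask import Flask, request, jsonify
from haystack import Finder
from haystack.document_store.elasticsearch import ElasticsearchDocumentStore
from haystack.retriever.dense import DensePassageRetriever
from haystack.reader.farm import FARMReader

app = Flask(__name__)

@app.route("/qna_pretrain", methods=["POST"])
def qna_pretrain():
    payload = request.get_json()
    question = payload["question"]
    index = payload.get("index", "document")
    top_k = int(payload.get("top_k", 3))

    # Build the retriever/reader pipeline against the requested index
    document_store = ElasticsearchDocumentStore(host="localhost", index=index)
    retriever = DensePassageRetriever(document_store=document_store)
    reader = FARMReader(model_name_or_path="distilbert-base-uncased-distilled-squad", use_gpu=False)
    finder = Finder(reader, retriever)

    # Return the top-k answers together with their scores and context
    prediction = finder.get_answers(question=question, top_k_retriever=10, top_k_reader=top_k)
    answers = [{"answer": a["answer"], "score": a["score"], "context": a["context"]}
               for a in prediction["answers"]]
    return jsonify({"question": question, "answers": answers})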
The requirements.txt for the app lists the following packages:
Flask
gunicorn
futures
farm-haystack
The Question Answering API will be available at port 8777.
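A minimal Dockerfile sketch that would support the build and run commands below; the base image, gunicorn options, and the assumption that the Flask app object is app in main.py are mine, while the /usr/src/app working directory matches the path used by the docker cp commands later in the article:

FROM python:3.7

WORKDIR /usr/src/app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8777
# Serve the Flask app defined in main.py with gunicorn on port 8777
CMD ["gunicorn", "--bind", "0.0.0.0:8777", "--timeout", "600", "main:app"]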
Let's run the command below to build the Docker image:
docker build -t qna:v1 .
Run the Flask API in a Docker container using the command below:
docker run --name qna_app -d -p 8777:8777 xxxxxxxxx
Note: xxxxxxxxx is the image id
Confirm that the Docker container is running using the command below:
docker ps
It will show all the running containers as below:
Now the QnA API is running successfully at http://localhost:8777.
Let's prepare a PDF document from the Medium Rules text, which will be uploaded into Elasticsearch. Once the PDF is ready, upload it using the API as in the below snapshot:
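Equivalently, the upload can be triggered from the command line; the endpoint path, file name, and index name below follow the sketch above and are illustrative:

curl -X POST http://localhost:8777/upload -F "file=@medium_rules.pdf" -F "index=medium"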
Verify the uploaded document as in the below snapshot:
Let's ask a question using the qna_pretrain endpoint. So far we have used only the pre-trained model "distilbert-base-uncased-distilled-squad", which provides good accuracy. In the next section, I will demonstrate how to annotate data and improve the model. To query a question, use the API as in the below snapshot:
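For example, a query against the pre-trained endpoint could look like the following; the payload fields follow the sketch above and the question is just an illustration:

curl -X POST http://localhost:8777/qna_pretrain -H "Content-Type: application/json" -d '{"question": "What is not allowed on Medium?", "index": "medium", "top_k": 3}'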
Let's try to improve the model by training it ourselves as per our requirements. For any domain-specific use case we have to train the model ourselves, and the Haystack Annotation tool will be used to label our data. In this article, I will use the hosted version of the annotation tool from Haystack.
For the local version, follow the instructions here: https://github.com/deepset-ai/haystack/tree/master/annotation_tool
Now create an account with your personal email id and upload the txt document. Annotate your questions and answers as in the below snapshot:
Now export all your labels in SQuAD format as "answers.json" and copy the file into the Docker container with the command below:
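Assuming the same /usr/src/app working directory that the article uses for main.py later on:

docker cp answers.json qna_app:/usr/src/app/answers.json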
After collecting the training labels, this Python utility will help us train and fine-tune the model based on our requirements. Let's proceed with the training using 10 epochs.
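A sketch of what util-trainer.py might contain, using FARMReader.train; the file names and save directory match what is referenced elsewhere in the article, while the remaining arguments are assumptions:

from haystack.reader.farm import FARMReader

# Start from the same pre-trained SQuAD checkpoint used by the API
reader = FARMReader(model_name_or_path="distilbert-base-uncased-distilled-squad", use_gpu=False)

# Fine-tune on the labels exported from the Haystack annotation tool (SQuAD format)
reader.train(
    data_dir=".",                   # directory containing answers.json
    train_filename="answers.json",
    n_epochs=10,                    # 10 epochs, as described above
    use_gpu=False,
    save_dir="train_model",         # the fine-tuned model ends up here
)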
Use the Docker command below to get a shell inside the running container:
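For example, to open a shell in the qna_app container started earlier:

docker exec -it qna_app bash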
To start the training, run the command "python util-trainer.py".
At the end of the training, the model is saved in the train_model directory. Training is now complete.
Let's modify the code to serve both the trained and the pre-trained model from the same endpoint. We are going to add an additional parameter, "mode", to select which model is used.
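A sketch of how the combined endpoint could look; the route name and mode values are assumptions, and the trained model path matches the train_model directory produced by the trainer above:

from flask import Flask, request, jsonify
from haystack import Finder
from haystack.document_store.elasticsearch import ElasticsearchDocumentStore
from haystack.retriever.dense import DensePassageRetriever
from haystack.reader.farm import FARMReader

app = Flask(__name__)

@app.route("/qna", methods=["POST"])
def qna():
    payload = request.get_json()
    question = payload["question"]
    index = payload.get("index", "document")
    mode = payload.get("mode", "pretrained")  # "pretrained" or "trained"

    # Pick the model directory or model name based on the requested mode
    model_path = "train_model" if mode == "trained" else "distilbert-base-uncased-distilled-squad"

    document_store = ElasticsearchDocumentStore(host="localhost", index=index)
    retriever = DensePassageRetriever(document_store=document_store)
    reader = FARMReader(model_name_or_path=model_path, use_gpu=False)
    finder = Finder(reader, retriever)

    prediction = finder.get_answers(question=question, top_k_retriever=10, top_k_reader=3)
    return jsonify({
        "mode": mode,
        "question": question,
        "answers": [{"answer": a["answer"], "score": a["score"]} for a in prediction["answers"]],
    })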
Publish the updated code to the Docker container with the command below:
docker cp main.py qna_app:/usr/src/app/main.py
Now restart the container so the changes are reflected in the API:
docker restart qna_app
Now the API can serve both the trained and the pre-trained model.
Now, let's compare the two models (trained and pre-trained) and analyze the most probable answers. Call the API as in the below snapshot:
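For example, the same question can be sent twice, once per mode; the route and payload follow the sketch above:

curl -X POST http://localhost:8777/qna -H "Content-Type: application/json" -d '{"question": "What is not allowed on Medium?", "index": "medium", "mode": "pretrained"}'

curl -X POST http://localhost:8777/qna -H "Content-Type: application/json" -d '{"question": "What is not allowed on Medium?", "index": "medium", "mode": "trained"}'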
The trained model gives perfect results, since we provided the labels in the training data.
I hope we have learned how to develop our own Question Answering API and fine-tune the model.
The same article is published on my Medium channel.
The complete source code is available here
The media shown in this article are not owned by Analytics Vidhya and are used at the Author's discretion.