Implementing SHAP and LIME

[Figure: Opacities in the lungs caused by pneumonia]

In the spring of 2020, the spread of COVID-19 caused hundreds of thousands of deaths worldwide, largely due to severe pneumonia combined with the immune system's reaction to the infection. Your job is to develop an AI system that detects pneumonia. Doctors are reluctant to accept black-box algorithms such as your deep-learning-based method. As an AI engineer you need to listen to them and try to satisfy their needs; they are your customer, after all. They tell you that any automated diagnostic system that processes the imaging they give you must be explainable.

They give you the COVID X-ray / CT imaging dataset (see the links below) and ask you to do the following:

    First, you find this implementation of the method called Local Interpretable Model-Agnostic Explanations (LIME). You also read this article, get your hands dirty, and replicate the results in your Colab notebook with a GPU-enabled kernel (a minimal code sketch follows this list). (40%)
    A fellow AI engineer tells you about another method called SHAP, which stands for SHapley Additive exPlanations, and she mentions that Shapley was a Nobel Prize winner, so it must be important. You then find out that Google is using it and wrote a readable white paper about it, and your excitement grows. Your manager sees you in the corridor and mentions that your work is needed soon. You are keen to impress her and start writing your 2-3 page summary of the SHAP approach as it can be applied to explaining deep learning classifiers such as the ResNet network used in (1). (40%)
    After your presentation, your manager is clearly impressed with the depth of the SHAP approach and asks for some results explaining the COVID-19 diagnoses with it. You notice that the extremely popular SHAP GitHub repo already has an example with the VGG16 network applied to ImageNet. You think it won't be too difficult to plug in the model you trained in (1) and explain it (a starting sketch follows the links below). (20%)
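
To make step (1) concrete, here is a minimal sketch of explaining a single chest X-ray with LIME. It assumes a trained Keras classifier called model (for instance a ResNet-style network trained on the dataset above) and a preprocessed RGB image array image scaled to [0, 1]; both names are placeholders, not part of the assignment.

    import matplotlib.pyplot as plt
    from lime import lime_image
    from skimage.segmentation import mark_boundaries

    # LIME perturbs superpixels of the image and fits a local, interpretable
    # linear model to the classifier's outputs on the perturbed copies.
    explainer = lime_image.LimeImageExplainer()
    explanation = explainer.explain_instance(
        image.astype('double'),
        classifier_fn=model.predict,  # must return class probabilities, shape (n, n_classes)
        top_labels=2,
        hide_color=0,
        num_samples=1000,
    )

    # Highlight the superpixels that most support the top predicted class.
    label = explanation.top_labels[0]
    img, mask = explanation.get_image_and_mask(
        label, positive_only=True, num_features=5, hide_rest=False
    )
    plt.imshow(mark_boundaries(img, mask))
    plt.title('LIME explanation for class %d' % label)
    plt.show()

The num_samples parameter controls how many perturbed copies LIME generates; larger values give more stable explanations at the cost of more forward passes through the network.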

LINKS:
https://github.com/ieee8023/covid-chestxray-dataset
https://github.com/aildnont/covid-cxr
https://towardsdatascience.com/investigation-of-explainable-predictions-of-covid-19-infection-from-chest-x-rays-with-machine-cb370f46af1d
https://arxiv.org/abs/1705.07874
https://storage.googleapis.com/cloud-ai-whitepapers/AI%20Explainability%20Whitepaper.pdf
https://github.com/slundberg/shap
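
As a starting point for step (3), the sketch below adapts the image example from the SHAP repository to a hypothetical Keras model trained in (1). SHAP assigns each pixel a Shapley value, i.e. its average marginal contribution to the prediction over subsets of input features; GradientExplainer estimates these values with expected gradients over a background sample. The names model, x_train and x_test are placeholders for your own trained network and preprocessed image arrays.

    import numpy as np
    import shap

    # A small background sample of training images over which the
    # expected-gradients approximation of the Shapley values is taken.
    background = x_train[np.random.choice(len(x_train), 50, replace=False)]
    explainer = shap.GradientExplainer(model, background)

    # shap_values is a list with one array per output class, each array
    # having the same shape as the explained images.
    shap_values = explainer.shap_values(x_test[:4])

    # Overlay per-pixel attributions on the X-rays: red regions push the
    # prediction towards the class, blue regions push it away.
    shap.image_plot(shap_values, x_test[:4])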
