Relevium is an app that assesses the extent of melanoma on a user's skin and provides the information needed to track the progression of the disease. The app uses the smartphone's camera together with a machine learning model that turns the camera feed into a diagnosis displayed in the app. By gathering data points from repeated scans of the user's skin, Relevium can track the progression of the disease over time and suggest next steps for managing it. I built the app with Apple's Core ML framework, which let me integrate the models I trained into Relevium, and with the companion Vision framework, which works alongside Core ML to drive the image-analysis pipeline and turn the model's output into human-readable text. To train my machine learning model, I used images from the Stanford Medical Center showing melanoma at different stages and severities, so that the model could give the most accurate possible assessment of the user's disease progression. Both the frontend and backend of the app are written entirely in Swift using Apple's iOS Software Development Kit.
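The image-analysis pipeline described above can be sketched in Swift as follows. This is a minimal illustration, not the app's actual code: `MelanomaClassifier` is a hypothetical name standing in for the trained model bundled with the app (Xcode generates a Swift class with the same name as the `.mlmodel` file), and the label/confidence formatting is an assumption about how the result is shown.

```swift
import CoreML
import Vision
import UIKit

// Sketch: classifying a skin photo with a Core ML model via Vision.
// "MelanomaClassifier" is a hypothetical model class name; Xcode generates
// such a class automatically from the .mlmodel file added to the project.
func classifySkinImage(_ image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage,
          let coreMLModel = try? MelanomaClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion("Unable to load model")
        return
    }

    // Vision wraps the Core ML model in an image-analysis request and
    // handles resizing/cropping the input to what the model expects.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else {
            completion("No result")
            return
        }
        // Report the top predicted stage and its confidence as readable text.
        completion("\(top.identifier) – \(Int(top.confidence * 100))% confidence")
    }
    request.imageCropAndScaleOption = .centerCrop

    // Run the request on the captured camera frame or photo.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

In this pattern, Vision handles the image preprocessing while Core ML runs the model itself, which is why the two frameworks are used together.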

What inspired you (or your team)?
My grandfather was diagnosed with Stage 2 melanoma, which means the disease is spreading and requires frequent treatment from his doctor. Since he spends the majority of his time in the hospital and has to make frequent trips there, I decided there had to be a way to help him check the extent of his disease more quickly, without going to the doctor's office just to look at the disease, which takes a lot of time out of his day. I then realized that over 80% of Americans have a smartphone, my grandfather included, and that since people check their phones daily, I could give them the ability to check on the status of their melanoma right from their smartphone. From this, I decided to make a mobile app that could do just that: assess the extent of melanoma and provide detailed information about the disease, including its likelihood of spreading or worsening on the patient's body. After some time developing the application, I finally got it functioning as intended. Shortly after, I tested the app on my grandfather in the hospital, where I had done most of my development work, including gathering the data to train my machine learning model, to see whether it functioned and gave the appropriate diagnosis. I was thrilled to find that the app worked on my grandfather: it correctly assessed the severity of his disease as Stage 2 melanoma with a medium-to-high chance of spreading and worsening. I was glad that I was able not only to save a substantial amount of time in my grandfather's day, but also to use my skills in machine learning and app development to make this dream possible.
In the future, I look forward to releasing the app to the public and making it open source, so that it can benefit other people with melanoma who are looking for a way to save time while receiving precise, accurate daily updates on the extent of the disease on their body.