I'm a computer engineer specializing in web development. I can handle anything the world throws at me: I never give up, and I always enjoy learning how things work.
In my free time, I like to game, read, play the guitar, and play chess.
Check out my Projects, Work Experience and Education. Also feel free to contact me via Email, LinkedIn and GitHub.
A Stable Diffusion-based model I developed during my time at the AAST College of Artificial Intelligence that generates images from text input. I also built a web interface to make the model easy to access.
A large language model (LLM) I developed on top of Facebook's Llama that understands Arabic. I built it during my research period at the AAST College of Artificial Intelligence.
A script I wrote that automates tedious bureaucratic tasks at work (AASTMT). Sisyphus logs into my university's student portal and downloads each student's schedule. It then parses the schedules and groups the students enrolled in each course into separate Excel files, one per lecturer.
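The grouping step can be sketched roughly as follows. This is a minimal illustration, not the actual script: the record format and names are hypothetical, and it prints groups instead of writing Excel files.

```python
from collections import defaultdict

# Illustrative records as Sisyphus might produce them after parsing the
# downloaded schedules: (student, course, lecturer). Names are made up.
records = [
    ("Ali", "CS101", "Dr. Hassan"),
    ("Mona", "CS101", "Dr. Hassan"),
    ("Ali", "EE201", "Dr. Salem"),
]

# Group enrolled students per (lecturer, course); in the real script each
# lecturer's groups would be written out as an Excel file.
groups = defaultdict(list)
for student, course, lecturer in records:
    groups[(lecturer, course)].append(student)

for (lecturer, course), students in sorted(groups.items()):
    print(f"{lecturer} - {course}: {', '.join(students)}")
```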
This is my graduation project, and I'm responsible for its backend. Jamify is a platform for musicians that offers real-time jamming with different instruments, as well as a personalized social network aimed at enriching the music community.
A Twitter bot that scrapes azlyrics.com for song lyrics and tweets a chosen portion of them. I also developed a CLI program for filtering songs by artist, album, and release date. The project was submitted to Hacktoberfest 2022, an open-source community event.
A configurable CLI tool I developed during my internship at Bibliotheca Alexandrina that accesses and parses the .WAT files produced by BibAlex's web crawlers in the archive department. Its main purpose is to index the links found in those files and build parent-child relations between them before handing the indexed links to arcapi for storage.
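The parent-child indexing can be sketched like this. It is a simplified illustration under assumed inputs: real .WAT records are JSON metadata envelopes, while here the extraction step is reduced to ready-made (source page, outlink) pairs with made-up URLs.

```python
from collections import defaultdict

# Hypothetical (source_page, outlink) pairs, standing in for the links
# lindexer would extract from .WAT records.
extracted = [
    ("http://example.org/", "http://example.org/a"),
    ("http://example.org/", "http://example.org/b"),
    ("http://example.org/a", "http://example.org/b"),
]

# Build both directions of the relation: which pages a URL links to
# (children) and which pages link to it (parents).
children = defaultdict(set)
parents = defaultdict(set)
for parent, child in extracted:
    children[parent].add(child)
    parents[child].add(parent)

# In the real tool, these relations would then be sent to arcapi,
# which stores them in a Neo4j graph database.
```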
An API I developed during my internship at Bibliotheca Alexandrina that stores the links received from the CLI tool lindexer in a Neo4j graph database and retrieves them with various search filters. The API is used by several development teams at BibAlex.
A website developed while taking Udacity's Full Stack Web Developer Nanodegree. It has a simple backend offering CRUD operations and JSON endpoints, while the frontend consists of Jinja2 templates.
A website developed for a local food company in Egypt that contains basic information about the company, its inventory, and its contact details.
A Twitter management portal that lets users search, monitor, and analyze the reach and engagement of tweets. Users can also deploy and control custom bots from their accounts.
An open-source Twitter bot that takes a list of subreddits, periodically fetches random media from them, and tweets it, marking every used image to prevent duplicates.
I currently work on eSpace's own products as well as projects for high-profile clients. Besides web development, I'm building my skills in multiple domains, such as data engineering and IoT.
I worked there as a teaching assistant and AI researcher. I contributed to various AI and robotics projects and taught the relevant courses to many students. I also mentored students to improve their programming skills, helping some of them land jobs in tech and others participate in competitions.
I work at eSpace as a full-stack web developer (Rails + React) alongside exceptionally skilled developers. I work on projects for the Saudi Arabian government that drive Saudi Arabia's digital transformation and generate millions of dollars in revenue, as well as on eSpace's own products. eSpace has one of the best work environments one could ask for, and I'm glad I get to work there.
I worked there as a teaching assistant, teaching both introductory and advanced computer engineering courses to many students. I also developed tools that dramatically reduced the time spent on administrative tasks.
I worked there for one month as a software engineering intern on the web archiving team, which deploys and operates web crawlers that have been running since 2002, mainly targeting Arabic websites. My task was to develop tools that access and parse the crawled data and store the relevant links in a format other development teams could easily use. During my time there I developed lindexer and arcapi.