Software Developer working in the State of New Jersey, USA, with a graduate degree in Computer Science
DevOps
Jenkins, Eclipse, GitHub, Docker, AWS, Heroku, Flask, PyCharm, Pytest, Jupyter Notebook
Web Development
HTML, CSS, JavaScript, PHP, HTTP, Responsive Web Design, SEO, Web Analytics, Digital Marketing
Software Testing
QA Methodologies, Test Plans, Test Cases, Functional Tests, Regression Tests, Test Automation, Selenium
Networking
Cisco Packet Tracer, Telnet, Wireshark, TCP/IP, Hadoop
Programming Languages
Python, C/C++, Java, J2EE, .NET, C#, SQL, Shell scripting
Databases
SQLite, Oracle, MySQL
Table Tennis, Carrom, Roller Skating, Badminton, Swimming, Art and Craft, Piano, Yoga
A secure website was designed where registered users can log in to their accounts and maintain a personalized database of songs. Each song record stores information such as Title, Artist, Year, and Genre. Authenticated users can add, edit, or delete song records.
Docker was used for containerization, and the product was deployed on the Heroku cloud platform.
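The per-user song store described above can be sketched as follows. The `Song` fields mirror the record information (Title, Artist, Year, Genre); the `SongDatabase` class and its method names are hypothetical illustrations — the deployed application handled authentication and persistence through Flask rather than an in-memory store.

```python
from dataclasses import dataclass

@dataclass
class Song:
    """One song record, mirroring the fields kept per song."""
    title: str
    artist: str
    year: int
    genre: str

class SongDatabase:
    """In-memory stand-in for one authenticated user's song collection."""

    def __init__(self):
        self._songs = {}
        self._next_id = 1

    def add(self, song):
        """Insert a song and return its generated id."""
        song_id = self._next_id
        self._songs[song_id] = song
        self._next_id += 1
        return song_id

    def edit(self, song_id, **changes):
        """Update selected fields of an existing song record."""
        song = self._songs[song_id]
        for key, value in changes.items():
            setattr(song, key, value)

    def delete(self, song_id):
        """Remove a song record."""
        del self._songs[song_id]

    def get(self, song_id):
        return self._songs[song_id]
```

In the real site each of these operations was gated by the login session, so only the record's owner could modify it.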
The NJIT Highlanders sports website was replicated to develop an intranet sports website. Functionality was developed to publish dynamic web pages with real-time information on sports events. A database was maintained for sports facilities, teams, and matches, along with profiles of coaches, players, and athletes.
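A database layout like the one maintained for the sports site can be sketched with SQLite (one of the databases listed above). The table and column names below are illustrative assumptions, not taken from the original project.

```python
import sqlite3

# Hypothetical schema: facilities, teams, profiles (coaches/players/athletes),
# and games between teams. Names are illustrative only.
SCHEMA = """
CREATE TABLE facility (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE team     (id INTEGER PRIMARY KEY, name TEXT,
                       facility_id INTEGER REFERENCES facility(id));
CREATE TABLE profile  (id INTEGER PRIMARY KEY, name TEXT, role TEXT,
                       team_id INTEGER REFERENCES team(id));
CREATE TABLE game     (id INTEGER PRIMARY KEY,
                       home_id INTEGER REFERENCES team(id),
                       away_id INTEGER REFERENCES team(id),
                       played_on TEXT);
"""

def init_db(path=":memory:"):
    """Create a fresh database with the sketch schema and return the connection."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

The dynamic pages would then be rendered from queries against tables like these, so scores and profiles update as soon as the database changes.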
Prediction models were built with several supervised machine learning algorithms and their accuracies compared. Each model was fit on a training dataset and then used to predict the output for a test dataset.
The results were submitted to Kaggle for evaluation.
A MapReduce job was run on Hadoop installed on an AWS EC2 instance. MapReduce in Hadoop fragments the input data and assigns the chunks across the nodes of a Hadoop cluster for parallel processing. Hadoop was installed on Ubuntu and configured for single-node pseudo-distributed mode. Given an input file listing a subset of the cards of a deck, a Java program was written and compiled on the AWS Ubuntu server to compute the missing cards. The compiled JAR file and the input file were copied to Hadoop, and the MapReduce job was run on the Hadoop Distributed File System to compute the missing cards. The output file with the missing cards, generated on Hadoop DFS, was then copied to the local file system.
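The original job was written in Java for Hadoop; the Python sketch below only illustrates the map/shuffle/reduce logic for finding missing cards, assuming a hypothetical input format of one `Suit,Rank` card per line.

```python
from itertools import groupby

SUITS = ["Clubs", "Diamonds", "Hearts", "Spades"]
RANKS = [str(n) for n in range(2, 11)] + ["Jack", "Queen", "King", "Ace"]

def mapper(line):
    """Map phase: emit a (suit, rank) pair for each input card line."""
    suit, rank = line.strip().split(",")
    yield (suit, rank)

def reducer(suit, ranks_seen):
    """Reduce phase: for one suit, emit the ranks absent from the input."""
    seen = set(ranks_seen)
    for rank in RANKS:
        if rank not in seen:
            yield (suit, rank)

def run_job(lines):
    """Simulate the whole job: map, shuffle/sort by suit, then reduce."""
    pairs = [kv for line in lines for kv in mapper(line)]
    # Hadoop's shuffle groups all values for a key onto one reducer;
    # sorting then grouping by suit imitates that step.
    pairs.sort()
    missing = []
    for suit, group in groupby(pairs, key=lambda kv: kv[0]):
        missing.extend(reducer(suit, (rank for _, rank in group)))
    # Note: as in real MapReduce, a suit with no input cards at all
    # never reaches a reducer, so a fully missing suit needs extra handling.
    return missing
```

On the cluster, each mapper would process one input split in parallel, which is what makes the approach scale beyond a single deck.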
Master of Science in Computer Science
Major: Computer Science, Minor: Computer Networking and Security
Bachelor of Engineering in Information Science and Engineering
Pensions EPIC/MBOS web development: Full stack web development of front end and back end program modules. All phases of Software Development Life Cycle: Requirements, design, setup/install, implementation, test automation, documentation, deployment to Division of Pension Benefits (DPB)
Developed new features, maintained existing modules, created test plans, test cases and test reports, installed and setup development and production environment, documented flow charts for complex applications, updated design documents for system architecture and client requirements.
Implemented, tested and deployed software in production environment for client requests from Division of Pension Benefits (DPB). As an active team player, engaged in consultation with supervisor, technical discussions with peer members, communicated with client to gather requirements, and mentored a new team member.
Cisco Packet Tracer was used to simulate the network configuration of email servers in a LAN. A test plan and test cases were written for automated testing to verify successful configuration.