Case Study: API Development and Data Integration for Shpock

Client: Shpock
Project: API Endpoint Development, Data Scraping, and Test Coverage Enhancement
Technologies Used: PHP, Symfony, MongoDB, Doctrine, Docker, NodeJS, ElasticSearch


Overview

Shpock, a popular marketplace app, needed new API functionality, broader test coverage, and data integration to support its growing platform. This project focused on developing new API endpoints with Symfony and MongoDB, expanding unit test coverage for greater code reliability, and implementing a data scraper to bring third-party listing and delivery information into the platform.


Challenges

  • Expanding API Functionality: Shpock needed additional API endpoints to support its growing features and data requirements.
  • Limited Test Coverage: The existing codebase required more unit tests to increase reliability, reduce bugs, and ensure seamless updates.
  • Data Integration from External Sources: There was a need to integrate data from third-party listings, requiring custom scraping and storage solutions to maintain real-time information in the Shpock database.

Solutions Implemented

  1. API Endpoint Development with Symfony and MongoDB
    Using Symfony and the Doctrine MongoDB ODM, new API endpoints were developed to support various Shpock features. These endpoints let the platform handle larger data requests efficiently, creating a smoother experience for end users, while the data model was optimised for MongoDB to keep storage and retrieval scalable and flexible. A minimal sketch of such an endpoint appears after this list.
  2. Unit Testing and Test Coverage Expansion
    Unit tests were written for the new API endpoints and coverage was expanded across the existing code. These tests verified endpoint behaviour, minimised the risk of regressions, and contributed to a more stable codebase, essential for Shpock’s rapid release cycles. An example of what such a test might look like is shown after this list.
  3. NodeJS Web Scraper for Third-Party Data Integration
    A custom web scraper was built with NodeJS to gather listing and delivery information from a third-party website and store it in MongoDB, keeping the data up to date. This allowed Shpock to offer users enriched listing details and delivery options, enhancing the user experience. A simplified sketch of this kind of scraper is included after this list.
  4. Collaboration with the PHP Team in Austria
    Working closely with Shpock’s PHP team in Austria, we ensured that all work met the team’s technical standards, passed through smooth code reviews, and integrated cleanly into the platform’s architecture. This collaborative approach was critical to maintaining consistency and keeping team workflows efficient.
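
To illustrate the first item above, here is a minimal sketch of what such an endpoint could look like with Symfony and the Doctrine MongoDB ODM. The Listing document, its fields, and the route are illustrative assumptions rather than Shpock’s actual code.

```php
<?php
// src/Document/Listing.php (illustrative assumption, not production code)

namespace App\Document;

use Doctrine\ODM\MongoDB\Mapping\Annotations as MongoDB;

#[MongoDB\Document(collection: 'listings')]
class Listing
{
    #[MongoDB\Id]
    private ?string $id = null;

    #[MongoDB\Field(type: 'string')]
    private string $title = '';

    #[MongoDB\Field(type: 'float')]
    private float $price = 0.0;

    public function getId(): ?string { return $this->id; }
    public function getTitle(): string { return $this->title; }
    public function getPrice(): float { return $this->price; }
}
```

```php
<?php
// src/Controller/ListingController.php (illustrative assumption)

namespace App\Controller;

use App\Document\Listing;
use Doctrine\ODM\MongoDB\DocumentManager;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\Routing\Annotation\Route;

class ListingController extends AbstractController
{
    #[Route('/api/listings/{id}', methods: ['GET'])]
    public function getListing(string $id, DocumentManager $dm): JsonResponse
    {
        // Load the listing document from MongoDB through the Doctrine ODM
        $listing = $dm->getRepository(Listing::class)->find($id);

        if (!$listing) {
            return $this->json(['error' => 'Listing not found'], 404);
        }

        return $this->json([
            'id'    => $listing->getId(),
            'title' => $listing->getTitle(),
            'price' => $listing->getPrice(),
        ]);
    }
}
```

Keeping the controller thin like this lets the document mapping evolve independently of the API layer, which is what makes MongoDB’s flexible schema useful here.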
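For the second item, a test for such an endpoint could be written with Symfony’s WebTestCase. The route and expected responses follow the sketch above and are assumptions about the real test suite, not a copy of it.

```php
<?php
// tests/Controller/ListingControllerTest.php (illustrative assumption)

namespace App\Tests\Controller;

use Symfony\Bundle\FrameworkBundle\Test\WebTestCase;

class ListingControllerTest extends WebTestCase
{
    public function testUnknownListingReturns404(): void
    {
        $client = static::createClient();

        // Request an id that is not expected to exist in the test database
        $client->request('GET', '/api/listings/000000000000000000000000');

        // The endpoint should answer with a JSON 404 rather than an error page
        $this->assertResponseStatusCodeSame(404);
        $this->assertResponseHeaderSame('Content-Type', 'application/json');
    }
}
```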
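For the third item, the scraper could follow a pattern like the one below, using NodeJS with axios, cheerio, and the official MongoDB driver. The target URL, CSS selectors, and database and collection names are hypothetical placeholders, not the production values.

```javascript
// scraper.js — minimal illustrative sketch, not the production scraper
const axios = require('axios');
const cheerio = require('cheerio');
const { MongoClient } = require('mongodb');

const TARGET_URL = 'https://example.com/listings'; // hypothetical third-party page
const MONGO_URI = process.env.MONGO_URI || 'mongodb://localhost:27017';

async function scrapeListings() {
  // Fetch the listings page and load it into cheerio for DOM-style querying
  const { data: html } = await axios.get(TARGET_URL);
  const $ = cheerio.load(html);

  // Extract one record per listing card (selectors are hypothetical)
  const listings = $('.listing-card').map((_, el) => ({
    title: $(el).find('.listing-title').text().trim(),
    price: $(el).find('.listing-price').text().trim(),
    deliveryInfo: $(el).find('.delivery-info').text().trim(),
    scrapedAt: new Date(),
  })).get();

  // Upsert the scraped records into MongoDB so repeated runs stay current
  const client = new MongoClient(MONGO_URI);
  try {
    await client.connect();
    const collection = client.db('shpock').collection('external_listings');
    for (const listing of listings) {
      await collection.updateOne(
        { title: listing.title },
        { $set: listing },
        { upsert: true }
      );
    }
    console.log(`Stored ${listings.length} listings`);
  } finally {
    await client.close();
  }
}

scrapeListings().catch((err) => {
  console.error('Scrape failed:', err);
  process.exit(1);
});
```

Upserting on each run keeps the MongoDB collection in step with the third-party source, which is what allows the platform to surface up-to-date delivery and listing details.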

Results

  • Enhanced API Functionality: New API endpoints streamlined data handling and increased the platform’s capability to support additional features, enhancing user experience and operational flexibility.
  • Increased Code Reliability through Unit Testing: Improved test coverage resulted in a more resilient codebase, enabling faster releases and minimising post-deployment issues.
  • Real-Time Data Integration: The NodeJS scraper ensured that Shpock had access to timely, relevant data from third-party sites, allowing for more comprehensive listing details and optimised delivery information.

Conclusion

Through targeted API development, expanded testing, and innovative data scraping, we delivered a solution that strengthened Shpock’s infrastructure and enhanced its service offerings. The added functionality, improved data flow, and stable codebase have enabled Shpock to provide users with a reliable and enriched marketplace experience.