diggi.tize: showroomz architecture proposal

In the interest of time I am making a few assumptions about this app:

  • development speed is to be prioritized over cost, hence:
    • the backend is an ASP.NET Core Web API on .NET 6 using C#, running inside a Docker container
      • depending on the initially expected load:
        • on an already rented on-premise machine (low loads)
        • Azure App Services (medium loads)
        • on Google Kubernetes Engine using auto-scaling and a load-balancer (quick and heavy loads)
    • the frontend uses Angular as its framework
      • depending on the initially expected load:
        • GitHub Pages (completely free, average response time in certain areas)
        • Cloudflare Pages ($20 per zone; super quick response due to CDN breadth)
  • the app is mainly targeting Germany and uses Euro as currency
  • companies that would like to advertise on showroomz are referred to as shops
  • companies can have multiple entries on showroomz, which are referred to as offers
  • users that download the app and are swiping through offers are referred to as viewers
  • viewers only interact with the product using iOS or Android

Payment Provider

Generally, Stripe offers pretty robust support for one-time and subscription payments. However, SEPA direct debit (Lastschriftverfahren) is more prevalent than credit card and PayPal payments in Germany and, based on experience from previous projects, can have a very significant rejection rate (with each rejection resulting in a fee of up to €3.00 billed to whoever issues the payment).
Should this turn out to be an issue, Adyen (payment processor) in combination with Billwerk (subscription manager) offers stronger fraud protection. Adyen maintains a database of payment details and payment records and uses it to evaluate how trustworthy a given set of payment details is.
Both offer webhooks that inform our backend services when a payment succeeds, so that whatever the shop or viewer is paying for can be provisioned.
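
As a minimal sketch of how such a webhook could be consumed on our side, assuming Stripe and the Stripe.net package (the route, webhook secret, handled event, and the IFulfillmentService interface are placeholder assumptions, not existing code):

    using System.IO;
    using Microsoft.AspNetCore.Mvc;
    using Stripe;

    // Sketch of a Stripe webhook endpoint; IFulfillmentService and the secret are placeholders.
    [ApiController]
    [Route("webhooks/stripe")]
    public class StripeWebhookController : ControllerBase
    {
        private const string WebhookSecret = "whsec_...";   // from configuration in practice
        private readonly IFulfillmentService _fulfillment;

        public StripeWebhookController(IFulfillmentService fulfillment) => _fulfillment = fulfillment;

        [HttpPost]
        public async Task<IActionResult> Handle()
        {
            var json = await new StreamReader(Request.Body).ReadToEndAsync();

            // Verify the signature so only Stripe can trigger fulfillment.
            var stripeEvent = EventUtility.ConstructEvent(
                json, Request.Headers["Stripe-Signature"], WebhookSecret);

            if (stripeEvent.Type == Events.CheckoutSessionCompleted)
            {
                var session = (Stripe.Checkout.Session)stripeEvent.Data.Object;
                await _fulfillment.ActivatePurchaseAsync(session.ClientReferenceId);
            }

            return Ok();   // anything other than 2xx makes Stripe retry the event later
        }
    }

    public interface IFulfillmentService
    {
        Task ActivatePurchaseAsync(string reference);
    }

Adyen/Billwerk notifications would be handled analogously; only the signature verification and payload shape differ.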

Video streaming

Given that iOS and Android require different playback codecs and formats (which might also be entirely different from the source video uploaded by shops), Azure Media Services could be utilized here. Azure Media Services bills only for storage and processing, not for playback. With that in mind, there are two possible approaches.
In both cases:

  • shops would use a website to upload a video clip and their offer details
  • videos would be forwarded to Azure Media Services in their raw format
  • the resulting asset URL and the offer details would be stored in an MSSQL database

then either

  • source videos would be encoded using a dynamic preset
  • videos can then be encoded and played back at the same time, paying on demand

or, for popular videos crossing a certain threshold:

  • a transform job is created on Azure Media Services, which takes a source video and transforms it into formats for iOS and Android
  • each source video is run through this job, converting it into all required formats only once
  • these new format URLs are stored inside the MSSQL database (a rough schema sketch follows below)
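
To make the MSSQL side of both approaches concrete, a rough EF Core sketch could look like the following (all entity, property, and connection-string details are assumptions, not a finalized schema; the Microsoft.EntityFrameworkCore.SqlServer package is assumed):

    using Microsoft.EntityFrameworkCore;

    // Rough sketch of offers plus their per-platform video renditions; names are assumptions.
    public class Offer
    {
        public int Id { get; set; }
        public int ShopId { get; set; }
        public string Title { get; set; } = "";
        public string Description { get; set; } = "";
        public DateTime? ExpiresAt { get; set; }              // for time-critical offers
        public string SourceVideoUrl { get; set; } = "";      // raw upload sitting in Azure Media Services
        public List<VideoRendition> Renditions { get; set; } = new();
    }

    public class VideoRendition
    {
        public int Id { get; set; }
        public int OfferId { get; set; }
        public string Platform { get; set; } = "";            // e.g. "ios" or "android"
        public string StreamingUrl { get; set; } = "";        // filled once the transform job has finished
    }

    public class ShowroomzContext : DbContext
    {
        public DbSet<Offer> Offers => Set<Offer>();
        public DbSet<VideoRendition> Renditions => Set<VideoRendition>();

        protected override void OnConfiguring(DbContextOptionsBuilder options)
            => options.UseSqlServer("Server=...;Database=showroomz;...");   // connection string from config
    }

In the dynamic-preset approach only SourceVideoUrl is needed at first; in the pre-encoded approach the VideoRendition rows are filled once the transform job has completed.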

App (iOS & Android)

Given the team’s experience using Angular in the frontend, I would highly recommend using a tool like Capacitor over tools like Flutter or React Native.
Capacitor packages regular HTML, CSS, and JS into a native Android or iOS app. This way Angular can be used as usual; local tests can be run in the browser (no need to recompile and deploy an app to a test device, which makes for quicker iteration), and device-specific APIs like the geolocation API we need are wrapped and made available via JS as well.

Multiple WebApps (SPA)

Ideally, these apps could be split into various repositories. Possible apps I see here include:

  • a system for shops to upload and manage their offers
  • an administrative interface to
    • view and manage fraudulent and prohibited content reports
    • create and manage accounts for shops
    • create and manage accounts for viewers
  • a status page to indicate expected maintenance and incident reports (ideally hosted entirely separately)

Background work

This depends on the kind and load of the background work.
Generally, I would recommend using hosted services (.NET's IHostedService/BackgroundService).
These can be placed in various locations, based on expected load (a sketch follows after this list):

  • inside the ASP.NET Core API (low loads)
  • as an Azure Functions App (medium/high loads limited to one cloud provider)
  • as a containerized application (medium/high loads that can be orchestrated across various cloud providers)
    (to prepare for a move to medium/high loads, a message queue like RabbitMQ or Azure Queue Storage should be used; that way multiple instances of this background worker can be spun up and the queue simply distributes work among them in a round-robin fashion, so it is not handled by only one worker)
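
As a hedged sketch of such a worker, assuming Azure Queue Storage and the Azure.Storage.Queues package (the queue name, connection string, and message handling are placeholders; a RabbitMQ consumer would look similar):

    using Azure.Storage.Queues;
    using Microsoft.Extensions.Hosting;

    // Background worker that pulls jobs from a shared queue so multiple instances can share the load.
    public class BackgroundJobWorker : BackgroundService
    {
        private readonly QueueClient _queue =
            new QueueClient("UseDevelopmentStorage=true", "background-jobs");   // connection string from config

        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            while (!stoppingToken.IsCancellationRequested)
            {
                var messages = await _queue.ReceiveMessagesAsync(maxMessages: 10, cancellationToken: stoppingToken);

                foreach (var message in messages.Value)
                {
                    // ... handle the job described in message.MessageText ...
                    await _queue.DeleteMessageAsync(message.MessageId, message.PopReceipt, stoppingToken);
                }

                await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);   // simple polling back-off
            }
        }
    }

    // Low-load variant: register it directly inside the ASP.NET Core API.
    // builder.Services.AddHostedService<BackgroundJobWorker>();

Because every instance competes for messages on the same queue, scaling out is simply a matter of starting more containers of this worker.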

Push Notifications

Capacitor offers support for push notifications, using Google Firebase as a backend for both iOS and Android.
Building a custom notification service would require custom, platform-specific code and is only worth considering if the costs incurred through Google Firebase exceed the cost of having an in-house team maintain a custom notification platform.
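
For completeness, a hedged sketch of the server side sending a notification through Firebase Cloud Messaging, assuming the FirebaseAdmin package (the credentials file, token handling, and message content are placeholders):

    using FirebaseAdmin;
    using FirebaseAdmin.Messaging;
    using Google.Apis.Auth.OAuth2;

    public static class PushSender
    {
        public static async Task SendOfferReminderAsync(string deviceToken)
        {
            // Initialize the Firebase app once, using a service account exported from the Firebase console.
            if (FirebaseApp.DefaultInstance == null)
            {
                FirebaseApp.Create(new AppOptions
                {
                    Credential = GoogleCredential.FromFile("firebase-service-account.json")
                });
            }

            // deviceToken is the registration token the Capacitor app reports to our backend.
            await FirebaseMessaging.DefaultInstance.SendAsync(new Message
            {
                Token = deviceToken,
                Notification = new Notification
                {
                    Title = "Your offer is about to expire",
                    Body = "Renew it now to keep it visible to viewers."
                }
            });
        }
    }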

Email Service Provider

Emails would be split into two categories:

Transactional Emails

These are sent for anything that is directly tied to a user action:

  • resetting their password
  • confirming their email address
  • informing them about succeeded/failed payments…

For this, I would set up an abstract email service class registered as a singleton in ASP.NET Core, which can then be backed by specific implementations via dependency injection (a sketch follows after this list):

  • For local testing, a containerized SMTP catch-all service could be used. (While a regular e-mail account like Gmail or MS Exchange could be used instead, this is generally inadvisable: failed sends reduce the trustworthiness of the domain at various mail exchangers, and it could also lead to leaked credentials.)
  • For production and development deployments, SendGrid offers the ability to create layouts in its own backend (meaning frontend developers can change mail templates on the SendGrid website without involving backend developers), and our C# .NET code simply interacts with its API and provides the variables of the email that change (like names, the payment method, the email address, or password reset/email confirmation tokens).
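
A minimal sketch of that abstraction, assuming SendGrid's C# SDK (the interface name, template id, and sender address are placeholder assumptions):

    using SendGrid;
    using SendGrid.Helpers.Mail;

    public interface ITransactionalMailer
    {
        Task SendPasswordResetAsync(string recipient, string resetToken);
    }

    public class SendGridMailer : ITransactionalMailer
    {
        private readonly SendGridClient _client = new SendGridClient("SG.xxxxx");   // API key from config

        public async Task SendPasswordResetAsync(string recipient, string resetToken)
        {
            // The layout lives in SendGrid; we only supply the variables that change.
            var message = MailHelper.CreateSingleTemplateEmail(
                new EmailAddress("noreply@showroomz.example"),
                new EmailAddress(recipient),
                "d-passwordreset123",                       // dynamic template id from the SendGrid backend
                new { token = resetToken });

            await _client.SendEmailAsync(message);
        }
    }

    // Registration; swap the implementation per environment (local SMTP catcher vs. SendGrid):
    // builder.Services.AddSingleton<ITransactionalMailer, SendGridMailer>();

The local-testing variant would implement the same interface but talk to the containerized SMTP catcher instead.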

Marketing Emails

Compared to transactional emails, marketing emails (like pro-active retention mechanisms or optional reminder emails) can profit from having more information available about the user. This might include, but isn’t limited to, products or industries the user interacts with, regions they frequent, or data like age, or whether a viewer’s subscription or a shop’s offer is about to run out.
Mailchimp offers the ability to collect this data and then allows data analysts to quickly select specific user groups and send emails to them.
This data would be populated by a service inside the API that aggregates it based on user interactions.
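
A hedged sketch of such a sync step, assuming Mailchimp's Marketing API v3 is called directly via HttpClient (the merge field names, list id, and authentication setup are assumptions):

    using System.Net.Http.Json;
    using System.Security.Cryptography;
    using System.Text;

    public class MarketingSyncService
    {
        private readonly HttpClient _http;   // pre-configured with the Mailchimp base address and auth header

        public MarketingSyncService(HttpClient http) => _http = http;

        public async Task SyncViewerAsync(string email, string topIndustry, DateTime? subscriptionEndsAt)
        {
            // Mailchimp identifies list members by the MD5 hash of the lower-cased email address.
            var subscriberHash = Convert.ToHexString(
                MD5.HashData(Encoding.UTF8.GetBytes(email.ToLowerInvariant()))).ToLowerInvariant();

            await _http.PutAsJsonAsync($"/3.0/lists/<list-id>/members/{subscriberHash}", new
            {
                email_address = email,
                status_if_new = "subscribed",
                merge_fields = new
                {
                    TOP_INDUSTRY = topIndustry,                                // assumed custom merge field
                    SUB_ENDS = subscriptionEndsAt?.ToString("yyyy-MM-dd")      // assumed custom merge field
                }
            });
        }
    }

The aggregation itself can run as one of the background workers described above.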

Analysis of content to flag any prohibited content

Generally, support and content moderation are best done with dedicated staff. Azure offers automated flagging systems (e.g. Azure Content Moderator), whose results can be fed into a manual moderation tool that preemptively flags content or accounts crossing a certain threshold.
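
To illustrate the threshold idea, the glue between the automated flags and the manual review queue could look roughly like this (the threshold values and the IReviewQueue interface are assumptions; the score itself would come from the automated flagging service):

    // Feed automated moderation scores into a manual review queue once they cross a threshold.
    public interface IReviewQueue
    {
        Task EnqueueAsync(int offerId, bool hideImmediately);
    }

    public class ModerationGate
    {
        private const double ReviewThreshold = 0.7;      // tune based on false-positive tolerance
        private const double AutoBlockThreshold = 0.95;

        private readonly IReviewQueue _reviewQueue;      // hypothetical queue read by the moderation staff

        public ModerationGate(IReviewQueue reviewQueue) => _reviewQueue = reviewQueue;

        public async Task HandleScoreAsync(int offerId, double prohibitedContentScore)
        {
            if (prohibitedContentScore >= AutoBlockThreshold)
            {
                // Almost certainly prohibited: hide immediately, still surface to staff for confirmation.
                await _reviewQueue.EnqueueAsync(offerId, hideImmediately: true);
            }
            else if (prohibitedContentScore >= ReviewThreshold)
            {
                await _reviewQueue.EnqueueAsync(offerId, hideImmediately: false);
            }
            // below the threshold: no action, the offer stays visible
        }
    }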

Insane performance (interested in hearing your strategies)

Based on my video game development experience, perceived speed is relative, and techniques like caching and pre-loading content can already go a long way:

  • Tools like Sentry should measure the duration of actions across front- and backend systems so we can make informed decisions on optimization

    • Deploying these tools is required to gain insight into how long a user waits from touch input to first rendered pixel and to identify bottlenecks
  • Videos and descriptions can be pre-loaded while the app is open, and new content can be fetched in the background once the amount of remaining pre-loaded content falls below a certain threshold

    • If companies can create time-critical offers, these should include an expiration date; that way the client can skip irrelevant pre-loaded content before the first server-side request completes when the app is re-opened
  • DNS servers should resolve to instances physically close to the user

  • Similarly, offers for specific regions should be stored in database mirrors (either plain SQL or in-memory stores like Redis) closer to these locations (a caching sketch follows below)
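
As a hedged sketch of the Redis side, assuming StackExchange.Redis (the key naming, host, 5-minute TTL, and feed shape are assumptions):

    using System.Text.Json;
    using StackExchange.Redis;

    // Caches the pre-computed offer feed for a region close to that region's users.
    public class RegionalFeedCache
    {
        private readonly IDatabase _redis =
            ConnectionMultiplexer.Connect("redis-eu-central:6379").GetDatabase();

        public async Task<string?> GetFeedJsonAsync(string regionCode)
            => await _redis.StringGetAsync($"feed:{regionCode}");

        public async Task StoreFeedAsync(string regionCode, object offers)
        {
            // Serialize once; clients pre-load from this cached payload instead of querying MSSQL.
            var json = JsonSerializer.Serialize(offers);
            await _redis.StringSetAsync($"feed:{regionCode}", json, expiry: TimeSpan.FromMinutes(5));
        }
    }

The API would consult the region-local cache first and only fall back to the primary MSSQL database on a miss.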