Supercharging Mobile Backend Development using ngrok & OpenResty


If you are developing mobile applications, chances are you are fetching data from a set of backend APIs. An API backend is a must if you want to develop for multiple platforms such as Android, iOS, and the web.

While developing such an API backend locally, you sometimes need to expose your service publicly, so that a mobile app can exercise it and you can test end-to-end behaviour of the product. You could deploy your code on a publicly accessible server, but that slows development down, since you have to redeploy on every change.

The solution could be to create a tunnel to a local server running on your machine, using tunneling software such as ngrok. However, if your backend is split into multiple services, each maintained by a different set of developers or teams, it may not be practical to run all of them on your local machine. And if you run just one service, your mobile application might not be fully functional during testing, or you might not be able to reach the component you want to test.

In this article, we'll take a look at how we use ngrok with OpenResty in a micro-services environment to enable our developers to create, debug, and test their services entirely from their local machines, without deploying a single line of code. Before we do that, let's see what our API architecture looks like.


We follow a service-oriented architecture, and all our API requests are routed through a single API gateway, which authenticates each request and then proxies it to an individual micro-service based on the request path. The authentication process involves retrieving the access token from an incoming request, looking up the requester's info based on that token, and adding a user header to the request.

a. Simplified architecture of our backend APIs

Now, a developer working on a particular service, say Cart, needs to do two things in order to start developing the service locally:

  1. Route requests hitting the cart service endpoints (e.g. /v2/cart) to their local machine.
  2. Route requests to the rest of the services (e.g. /v2/payment) to the API gateway, which can point to either the production or staging environment.

We achieve this using a combination of ngrok and OpenResty.

Ngrok is a tool to expose local servers behind NATs and firewalls to the public internet over secure tunnels.

OpenResty is a packaging of Nginx together with various useful libraries that can be used to write application servers.

Through ngrok, all incoming traffic is first routed to the developer's local machine, where it is handled by a locally running OpenResty server. Using Nginx's location block syntax, the developer can then specify which routes are served from the local machine and which are proxied to the API gateway; by default, all requests are proxied to the gateway. Using OpenResty lets us implement the request authentication inside Nginx with a small amount of Lua code.
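To illustrate the authentication step, here is a minimal sketch of what token handling in OpenResty can look like. The header names, the `look_up_user` helper, and the upstream port are hypothetical; the real lookup logic depends on your token store.

```nginx
location /v2/cart {
    access_by_lua_block {
        -- Pull the access token from the incoming request
        local token = ngx.req.get_headers()["Authorization"]
        if not token then
            ngx.exit(ngx.HTTP_UNAUTHORIZED)
        end
        -- look_up_user is a placeholder for your own lookup
        -- (e.g. a subrequest to an auth service or a cache hit)
        local user_id = look_up_user(token)
        if not user_id then
            ngx.exit(ngx.HTTP_FORBIDDEN)
        end
        -- Attach the resolved user to the proxied request
        ngx.req.set_header("X-User-Id", user_id)
    }
    proxy_pass http://localhost:8001;
}
```

With this in place, the locally running service receives the same user header it would get behind the real gateway.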

b. Proxying requests through local machine using ngrok

# Default route. Proxy everything to the gateway
location / {
    proxy_pass https://api-gateway.example.com;  # placeholder for your gateway URL
    proxy_set_header X-Forwarded-Host $server_name;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header Host $http_host;
}

# Proxy all requests on /v2/cart to localhost:8001
location /v2/cart {
    proxy_pass http://localhost:8001;
    proxy_set_header X-Forwarded-Host $server_name;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_pass_request_headers on;
    proxy_set_header Host $http_host;
}

Location block syntax for routing to localhost


We have tried to keep the setup process simple by dockerizing the ngrok and OpenResty setup. During the first run, developers need to build the containers and set up some basic configuration, such as the tunnel subdomain. Once that is done, the tunnel and server can be launched with a simple docker-compose up command. This launches the ngrok client, connects it to our private ngrok server, links the OpenResty container to the tunnel, and opens a simple dashboard that shows all requests and their responses. We also provide the flexibility to choose between the production and staging environments at launch.
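A docker-compose file for such a setup might look roughly like the following sketch. The service names, images, mount paths, and the `NGROK_SUBDOMAIN` variable are illustrative assumptions, not our actual configuration:

```yaml
version: "3"
services:
  openresty:
    image: openresty/openresty:alpine
    volumes:
      # Mount the routing config shown above
      - ./nginx.conf:/usr/local/openresty/nginx/conf/nginx.conf:ro
  ngrok:
    image: wernight/ngrok   # example ngrok client image
    environment:
      # Tunnel public traffic to the OpenResty container
      - NGROK_PORT=openresty:80
      - NGROK_SUBDOMAIN=${NGROK_SUBDOMAIN}
    ports:
      - "4040:4040"   # ngrok's local request-inspection dashboard
    depends_on:
      - openresty
```

After `docker-compose up`, requests to the public tunnel URL flow through the OpenResty routing rules, and the dashboard on port 4040 shows each request and response.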

The whole process of adding a custom route, obtaining a public URL, and pointing the app at that URL takes less than 15 seconds.

Advantages of this approach:

  1. We can quickly test the end-to-end workflow without deploying any code to a staging/production server. This saves time and lets us iterate on our code quickly.

  2. We can test our code against real data during development, instead of generating fake data or stubs.

  3. This drastically speeds up fixing production issues. For example, if we identify a production bug in the search service, all we need to do is point our app to a custom endpoint, route traffic on the search endpoint to our local machine, attach a debugger, and debug the issue.

  4. This setup can also be used as a mock/stub server. Instead of serving requests through a service running locally, we can set it up to return static responses for specific endpoints. This comes in handy when an API is not yet available while the app is being developed. Our frontend developers also find this extremely helpful during the development phase.

  5. Quick fixes and UI issues can be tested directly from a developer's machine without ever being deployed to our staging environment. Sometimes a developer needs quick feedback on a change to their locally running service; they can simply share the API URL with frontend developers, QA, or anyone else who can test the changes.

  6. This can also be used as an API request/response inspector. ngrok comes bundled with a nice user interface that shows request and response parameters, and it even supports replaying requests.
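The mock-server use case from point 4 needs only a few lines of configuration. Here is a minimal sketch that stubs out a hypothetical /v2/offers endpoint with a static JSON body:

```nginx
# Serve a static stub for an endpoint that isn't built yet
location /v2/offers {
    default_type application/json;
    return 200 '{"offers": [], "status": "ok"}';
}
```

Any request matching /v2/offers then gets the canned response locally, while every other route still follows the normal proxy rules.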


Developing at an acceptable speed in a service-oriented architecture had become too difficult for us, and we found our solution in ngrok, OpenResty, and Docker. This has greatly improved the productivity of all the teams that work directly on our consumer-facing products.

If you find this interesting, we are hiring for multiple engineering positions. Come join us.