Integration for big data synchronization application

Hi developer community.

Our team has developed a custom serverless app to synchronize data between Mobile Iron and Freshservice.

The app fetches all mobile devices from Mobile Iron and then imports or updates the corresponding assets in Freshservice. It is set to run once a day using the scheduling system.

The challenge we faced is that the application works fine when run from localhost, but fails when run as an installed app in Freshservice.

This is due to the platform's limit of a maximum of 20 seconds for a response to be returned.
Our data sync takes about an hour to finish, so it will never complete within this 20-second limit.


Before we continue developing a solution to this limitation, I wanted to hear whether anyone here in the community has experience running big data imports or synchronization from another system into Freshservice. Here I'm thinking of a solution that runs on a separate server rather than as an installed app in Freshservice.

I would also appreciate feedback on my idea for how we should develop this solution, which is:

  1. Create a Node.js app with the logic from the custom serverless app (a rough sketch of steps 1–3 follows this list).
  2. Use a scheduler in Node to run the import at specific times, recurring each day.
  3. Host the application on a server that can run Node.js applications.
  4. Create an interface controller (a custom app installed in Freshservice) that can send information to and receive information from the application.
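To make the idea concrete, here is a rough sketch of steps 1–3, assuming node-cron for scheduling and axios for HTTP; the MobileIron endpoint, credentials, and asset field mapping are placeholders that would need to be adapted:

```js
// Rough sketch only: a standalone Node.js sync service scheduled with node-cron.
// The MobileIron endpoint, credentials, and the asset field mapping are placeholders.
const cron = require("node-cron");
const axios = require("axios");

const MOBILEIRON_URL = process.env.MOBILEIRON_URL;     // e.g. https://mobileiron.example.com
const FRESHSERVICE_URL = process.env.FRESHSERVICE_URL; // e.g. https://yourdomain.freshservice.com
const FRESHSERVICE_KEY = process.env.FRESHSERVICE_API_KEY;

// Placeholder mapping from a MobileIron device to a Freshservice asset payload.
function mapDeviceToAsset(device) {
  return { name: device.deviceName, asset_type_id: 123, serial_number: device.serialNumber };
}

async function runSync() {
  // 1. Fetch all devices from MobileIron (placeholder endpoint and auth).
  const { data } = await axios.get(`${MOBILEIRON_URL}/api/v2/devices`, {
    auth: { username: process.env.MI_USER, password: process.env.MI_PASS },
  });

  // 2. Create or update the matching asset in Freshservice for each device.
  for (const device of data.results || []) {
    await axios.post(`${FRESHSERVICE_URL}/api/v2/assets`, mapDeviceToAsset(device), {
      auth: { username: FRESHSERVICE_KEY, password: "X" }, // API key as basic auth user
    });
  }
}

// 3. Run the import once a day at 02:00 server time.
cron.schedule("0 2 * * *", () => {
  runSync().catch((err) => console.error("Sync failed:", err.message));
});
```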

Do you see any trouble in building the solution like this?
Do you have experience with anything similar, or any dos and don'ts?

Best regards, Jonathan

Your idea of having a separate server is the way forward in this scenario.

For any such long-running operations, the solution today is to introduce a middleware layer (the external server in this case) that receives data from a Freshservice app on each event or via scheduled functions. The middleware layer can then batch the received data and periodically call the required endpoint with the batched data.

If the number of events is high enough that they might hit the rate limits for HTTP requests, then using scheduled functions to send one HTTP request per minute makes sense. In that case, each event callback adds the necessary data to data storage ($db), and the scheduled function reads the entries from data storage and sends out the HTTP requests. This also requires your middleware layer's endpoint to accept multiple entries per request.
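As a rough illustration of that pattern, a sketch of the serverless side could look like the following; it assumes the platform's $db and $request globals (with the older $request.post style), and the event name, key scheme, and middleware URL are placeholders:

```js
// Sketch of the batching pattern inside the Freshservice serverless app (server.js).
// The "pending_ids" index key is an assumption, since $db has no "list all keys" call.
exports = {
  // Illustrative event handler: queue the payload instead of calling out immediately.
  onAssetCreateHandler: async function (args) {
    const id = String(args.data.asset.display_id);
    await $db.set(`entry:${id}`, args.data.asset);

    // Maintain an index of pending keys so the scheduled function can find them.
    const index = await $db.get("pending_ids").catch(() => ({ ids: [] }));
    index.ids.push(id);
    await $db.set("pending_ids", index);
  },

  // Scheduled function (e.g. every minute): drain the queue with one batched request.
  onScheduledEventHandler: async function () {
    const index = await $db.get("pending_ids").catch(() => ({ ids: [] }));
    if (index.ids.length === 0) return;

    const entries = await Promise.all(index.ids.map((id) => $db.get(`entry:${id}`)));
    await $request.post("https://middleware.example.com/batch", {
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ entries })
    });

    await $db.set("pending_ids", { ids: [] });
  }
};
```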


Okay, so if I understand correctly:

I will build a custom app in Freshservice that sends the request to a middleware (a server running the application). All scheduling logic is within the app on the server.

I’m not sure I completely understand your suggestion.

My suggestion was that the Freshservice app should work as a controller to start and stop the sync, and to display logs when a sync is finished.

So when you, for example, click start in the Freshservice controller app, it sends a request to the server application to start the scheduling. The only thing returned to Freshservice is confirmation that the sync has started.
All logs are stored in MongoDB and displayed in the controller app.
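Roughly, the server side of that idea could look like this (a sketch assuming Express, node-cron, and the official MongoDB driver; the endpoint names, schedule, and collection names are just placeholders):

```js
// Rough sketch of the external server side (Express + node-cron + MongoDB driver).
// Endpoint names, the schedule, and the Mongo collection are hypothetical.
const express = require("express");
const cron = require("node-cron");
const { MongoClient } = require("mongodb");

const app = express();
const mongo = new MongoClient(process.env.MONGO_URL);
let job = null;

// Called by the Freshservice controller app to start the daily schedule.
app.post("/start", (req, res) => {
  if (!job) {
    job = cron.schedule("0 2 * * *", () => runSync());
  }
  res.json({ status: "started" }); // only an acknowledgement goes back to Freshservice
});

// Called by the Freshservice controller app to stop the schedule.
app.post("/stop", (req, res) => {
  if (job) {
    job.stop();
    job = null;
  }
  res.json({ status: "stopped" });
});

// The controller app fetches recent sync logs from here and displays them.
app.get("/logs", async (req, res) => {
  const logs = await mongo.db("sync").collection("logs")
    .find().sort({ finishedAt: -1 }).limit(20).toArray();
  res.json(logs);
});

async function runSync() {
  // ... the actual MobileIron → Freshservice sync logic goes here ...
  await mongo.db("sync").collection("logs").insertOne({ finishedAt: new Date(), status: "ok" });
}

mongo.connect().then(() => app.listen(3000));
```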

In my opinion, I don’t see why I should use Freshservice app logic when everything is built on an external server.

Can you give more details on why your suggestion is suitable?


Hi @JonathanHojtoft,

Thanks for scheduling the Office hour to discuss the solution.

You have three options to implement the solution for data synchronization.

Option 1: The app relies completely on the platform features and computation and stays within the time limits and rate limits. However, it will not complete the synchronization of the whole dataset within the desired short time.
Scheduled Events can be used in the serverless component to run every 10 minutes or so until all the required data is synchronized, so the total time depends on the amount of data. The sync progress has to be stored and passed to the next scheduled run to continue. Each scheduled run still has only a 20-second timeout.
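A rough sketch of how such a scheduled run could pick up where the previous one left off (the key name, page size, and the fetch/upsert helpers are hypothetical):

```js
// Rough sketch: a recurring scheduled event that processes one slice of the data per
// run and stores its position in $db so the next run can continue. The key name and
// the fetchDevicesPage / upsertFreshserviceAsset helpers are hypothetical.
exports = {
  onScheduledEventHandler: async function () {
    // Load the cursor left by the previous run (defaults to the first page).
    const state = await $db.get("sync_cursor").catch(() => ({ page: 1 }));

    // Do only a small slice of work, so it fits inside the 20-second timeout.
    const devices = await fetchDevicesPage(state.page);   // hypothetical helper
    for (const device of devices) {
      await upsertFreshserviceAsset(device);              // hypothetical helper
    }

    // Save progress; start over from page 1 once the whole dataset has been walked.
    const nextPage = devices.length > 0 ? state.page + 1 : 1;
    await $db.set("sync_cursor", { page: nextPage });
  }
};
```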

Option 2: Use an external server entirely to decide when to sync and to remember the sync status. The app only sends the app installation and account details to the external server. This avoids all of the platform limits, as the sync runs purely on the external server.
App Setup Events of the serverless app can be used to notify the external server. This method is used by some apps that don’t need to show the sync status to the customer in the frontend app.
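For example, roughly (a sketch assuming the app setup event handler and the $request global; the external URL and the installation parameter name are placeholders):

```js
// Rough sketch: on app install, hand the account details over to the external server,
// which then owns the whole sync. The URL and the installation parameter are placeholders.
exports = {
  onAppInstallHandler: async function (args) {
    try {
      await $request.post("https://sync-server.example.com/register", {
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          domain: args.domain,                        // Freshservice account domain
          apiKey: args.iparams.freshservice_api_key   // collected as an installation parameter
        })
      });
      renderData(); // signal that installation can proceed
    } catch (error) {
      renderData({ message: "Could not reach the sync server" }); // fail the installation
    }
  }
};
```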

Option 3: Use an external server to do the computation, and trigger the scheduled sync every day from the app. The external server notifies the app back through a webhook after the sync is completed. In this case, the app decides when to run the sync and knows the sync status from the last trigger or the last response from the external server.
App Setup Events will be used to start the schedule and send the webhook URL to the external server. Scheduled Events will be used to run the scheduled sync by triggering the external server. External Events will receive the completion status from the external server and update the key-value data storage of the app. I think @kaustavdm has suggested this option.
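A rough sketch of how these pieces could fit together (assuming the $schedule, generateTargetUrl, and external events APIs; the external server URL, payload shape, and schedule details are placeholders):

```js
// Rough sketch of Option 3. The external server URL, payload shape, and schedule
// details are placeholders.
exports = {
  // App setup: create the daily schedule and register the webhook the server calls back.
  onAppInstallHandler: async function (args) {
    const targetUrl = await generateTargetUrl();   // webhook URL for external events
    await $db.set("webhook", { url: targetUrl });
    await $schedule.create({
      name: "daily_sync",
      data: { domain: args.domain },
      schedule: { frequency: { type: "daily", time: "02:00", timezone: "Europe/Copenhagen" } }
    });
    renderData();
  },

  // Scheduled event: tell the external server to run the sync and where to report back.
  onScheduledEventHandler: async function (args) {
    const webhook = await $db.get("webhook");
    await $request.post("https://sync-server.example.com/run-sync", {
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ domain: args.data.domain, callbackUrl: webhook.url })
    });
  },

  // External event: the server posts the completion status here; store it for the frontend.
  onExternalEventHandler: async function (args) {
    await $db.set("last_sync", { status: args.data.status, finishedAt: args.data.finishedAt });
  }
};
```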

If you use Option 2 or 3, this Community Talk By Swedbyte - YouTube, where they share their experience using an external server with an app, would be helpful.
In the video, they show a setup with two servers and a load balancer in front of them, so the load is always handled and no events are missed or delayed during the downtime of one server. If your use case doesn’t require this much reliability and the sync can run whenever the server is up (with monitoring for downtime), that would be fine.

