Pull all tickets via the ticket API

Hi everyone, I'm brand new to Freshdesk and was hoping I could get some help. I'm trying to write a Python script that uses the API to pull all of our tickets out of Freshdesk. However, the script I've written so far only grabs around 30 records, and I'm expecting several thousand.

Can someone let me know what I’m doing wrong?

import requests
import json
import pandas as pd
from pyspark.sql import SparkSession
from datetime import datetime, timedelta
from requests.auth import HTTPBasicAuth

# Function to get tickets incrementally with pagination

def get_incremental_tickets(updated_since):
    url = f'https://{FRESHDESK_DOMAIN}.freshdesk.com/api/v2/tickets'
    headers = {
        'Content-Type': 'application/json'
    }
    params = {
        'updated_since': updated_since,
        'per_page': 100  # Set the number of tickets per page to 100 (maximum allowed by Freshdesk)
    }
    all_tickets = []
    page = 1

    while True:
        params['page'] = page
        response = requests.get(url, headers=headers, params=params, auth=(API_KEY, 'X'))

        if response.status_code == 200:
            tickets = response.json()
            if not tickets:
                break
            all_tickets.extend(tickets)
            page += 1
        else:
            print(f'Error: {response.status_code}')
            break

    return all_tickets

if __name__ == '__main__':
    # Fetch tickets updated since 2023
    updated_since = '2023-01-01T00:00:00Z'
    tickets = get_incremental_tickets(updated_since)

    if tickets:
        print('Found Tickets')

        # Normalize the JSON data to include custom fields as columns
        df = pd.json_normalize(tickets)

        df['responder_id'] = df['responder_id'].fillna(0).astype('int64')

        # Save DataFrame to a Parquet file
        parquet_table_name = f"{table_name}.parquet"
        parquet_file_path = parquet_file_path_base + parquet_table_name
        df.to_parquet(parquet_file_path, engine='pyarrow', index=False)
        print(f'Tickets saved to {parquet_file_path}')
        delta_table_name = f"{schema.lower()}.{table_name}"
        print(delta_table_name)

        # Load parquet file to data lake table
        spark = SparkSession.builder.appName("FabricNotebook").getOrCreate()
        df_spark = spark.read.parquet(parquet_file_path)

        # Write the Spark DataFrame to the Delta table
        df_spark.write.mode("overwrite").option("overwriteSchema", "true").format("delta").saveAsTable(delta_table_name)
        print(f'Tickets saved to Delta table {delta_table_name}')

    else:
        print('No tickets found or an error occurred.')
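In case it helps narrow things down, here's a stripped-down version of just the pagination loop from the script above, with all the Parquet/Spark steps removed (the domain and key are passed in as parameters instead of module-level constants):

```python
# Minimal repro of the pagination loop only; domain/api_key are passed in.
import requests

def fetch_pages(domain, api_key, updated_since):
    url = f'https://{domain}.freshdesk.com/api/v2/tickets'
    params = {'updated_since': updated_since, 'per_page': 100}
    all_tickets = []
    page = 1
    while True:
        params['page'] = page
        response = requests.get(url, params=params, auth=(api_key, 'X'))
        if response.status_code != 200:
            print(f'Error: {response.status_code}')
            break
        tickets = response.json()
        if not tickets:  # an empty page means we've run out of results
            break
        all_tickets.extend(tickets)
        page += 1
    return all_tickets
```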

Thanks @Dino_D. Looking at your file, I noticed there's an API key and a domain in it. Not sure if you meant to do that, but if not you may want to remove them and repost!
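If it helps for next time, one way to keep keys out of pasted scripts is to read them from environment variables instead of hardcoding them. A minimal sketch (the variable names here are just examples, not anything Freshdesk-specific):

```python
import os

def load_credentials():
    """Fetch Freshdesk credentials from the environment (example variable names)."""
    api_key = os.environ.get('FRESHDESK_API_KEY')
    domain = os.environ.get('FRESHDESK_DOMAIN')
    if not api_key or not domain:
        raise RuntimeError('Set FRESHDESK_API_KEY and FRESHDESK_DOMAIN before running.')
    return domain, api_key
```

And since the key has already been posted publicly, it's worth regenerating it in your Freshdesk profile settings regardless.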

I have undertaken this endeavor myself for a couple of customers. This is the script that I have written. It asks a few simple questions about your tenant and then pulls the data back into a CSV file.
FreshdeskAPIendpoints.py.txt (2.0 KB)
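On the ~30-record symptom: if I remember the docs right, 30 is Freshdesk's default page size, so I'd first check that `page` and `per_page` are actually reaching the API in your real script. A more defensive pattern is to follow the `link` response header Freshdesk returns while another page exists, rather than counting pages by hand. A rough sketch, untested against your tenant (`domain` and `api_key` are placeholders):

```python
# Sketch: paginate by following the `link` response header (rel="next").
import re
import requests

def next_page_url(link_header):
    # Pulls the URL out of a header like: <https://...?page=2>; rel="next"
    if not link_header:
        return None
    match = re.search(r'<([^>]+)>;\s*rel="next"', link_header)
    return match.group(1) if match else None

def get_all_tickets(domain, api_key, updated_since):
    url = f'https://{domain}.freshdesk.com/api/v2/tickets'
    params = {'updated_since': updated_since, 'per_page': 100, 'page': 1}
    all_tickets = []
    while url:
        response = requests.get(url, params=params, auth=(api_key, 'X'))
        response.raise_for_status()
        all_tickets.extend(response.json())
        url = next_page_url(response.headers.get('link'))
        params = None  # the next-page URL already embeds the query string
    return all_tickets
```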