How to customize the Service Logbook with import/export functionalities in JSON format

This tutorial uses the open-source Service Logbook app as an example to show how to implement import and export of data in JSON format.

Context

Before you start, we advise you to read the following sub-sections, as they will help you understand the context in which you will work.

Service Logbook

The open-source Service Logbook is an app that stores notes added by users in a MongoDB database and allows exporting them in .csv format.

How we will be using it: this app will be used as an example to show how to use some of the SDK's functions to access your MongoDB database and manipulate data through code.

Document Store

The Document Store is a solution for keeping data in your Cloud Function over time and testing it locally.

How we will be using it: the Document Store provides a safe, mocked database that you can use to test custom code without worrying about losing your data if you deploy faulty code. The functions we will create access the DocumentDBClient and will be used to import and export data as many times as we want before we start testing specific code: if that code causes the loss of the sample data we uploaded, we simply re-import it.

JSON format

The Document Store can accept and store any data in JSON format. For this reason, you have to make sure that the data you want to store in it can be converted to JSON.

How we will be using it: this is the format we will export and import the data in. Further implementation can be added in case the data has to be exported in a different format, as the Service Logbook already does with the Export to CSV button.
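
As a quick illustration (a minimal sketch with made-up field values, not part of the Service Logbook code), you can verify that a record is JSON-serializable before trying to store it:

import json
from datetime import datetime, timezone

# A note-like record we might want to store (illustrative values only)
note = {
    "text": "<div>Pump inspected.</div>",
    "created_on": round(datetime.now(timezone.utc).timestamp() * 1000),
    "category": None,
}

# json.dumps() raises a TypeError if any value cannot be represented in JSON,
# so it doubles as a quick "can the Document Store accept this?" check.
try:
    print(json.dumps(note, indent=2))
except TypeError as error:
    print(f"Not JSON-serializable: {error}")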

DocumentDBClient

The DocumentDBClient can be accessed via the SDK's FunctionContext, and contains pre-made functions used to perform CRUD operations.

How we will be using it: thanks to this class, we will call the methods we need to find, insert, update and delete data. The find_one and find functions are, as mentioned in the DocumentDBClient's documentation section, inherited from PyMongo.
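
For orientation, here is a minimal sketch (not part of the Service Logbook code) of what these calls look like once you have a DocumentDBClient instance, such as the one NotesClient stores as self.document_client; the filter syntax is the familiar PyMongo one:

def summarize_documents(document_db_client) -> dict:
    """
    Illustrative only: uses the PyMongo-style find_one()/find() methods that
    the DocumentDBClient inherits, on a hypothetical collection of documents
    keyed by agent_or_asset_id.
    """
    # Read a single document for one agent/asset
    document = document_db_client.find_one({"agent_or_asset_id": "agent"})

    # Read all documents
    all_documents = list(document_db_client.find({}))

    return {
        "has_agent_document": document is not None,
        "document_count": len(all_documents),
    }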

Use cases and needs

This guide provides an example to learn how to:

  • Use the Document Store.
  • Use the Cloud Functions SDK and its DocumentDBClient, as well as the UI Component SDK.
  • Test custom functionalities such as the extraction of data, its manipulation, deletion and more by using the tools provided by the SDKs in a safe environment.
  • Back up data by exporting it, or upload new data in JSON format.

Before you start

This is what you need to follow this guide:

  • Some understanding of Python.
  • Some understanding of Svelte.
  • A code editor - this tutorial (just like the rest of the tutorials on the IXON Documentation Website) uses Visual Studio Code.
  • A WSL distribution that you can access via the editor (for VS Code, check the note below).
❗️

Note for Microsoft Windows Users

If you are a Microsoft Windows user, make sure to check this article before you start with this guide.

Download the Service Logbook source code

To follow this example, you first need to make sure the project you will be using works properly. Follow these steps:

  1. Download the original Service Logbook code as a ZIP file via the link in its dedicated section of this article. Please note: do NOT clone the project! Download the ZIP instead to avoid future Git-related issues.

  2. Open the project in Visual Studio Code.

  3. Follow the instructions in this section of Prerequisites for development on Microsoft Windows systems and open the project in a remote folder.

  4. You should now be in the WSL environment. It is recommended to double-check that WSL: Ubuntu (or whichever WSL distribution you use) is displayed in the bottom-left corner of the editor. To confirm that everything works, open a new bash terminal and check that the path is displayed like in the picture below. If it is, run the make run command to install all the needed dependencies:

  5. If you open any .py file, you might notice some squiggly, yellow error lines under some dependency declarations. This is because you have not yet activated the virtual environment (venv) that was just installed. Navigate to the project's root folder and activate the virtual environment using the source command, like in this example:

    source "/mnt/c/Users/user1/Projects/cloud-functions/app-service-logbook/venv/bin/activate"

    You can double-check that the venv is selected by taking a look at the bottom-right corner, where you should see 3.12.3 (venv). In the terminal, this is what you should see:

  6. Finally, to start the project's Cloud Function (backend) and the DocumentDB, kill the terminal you used to initially run the make run command or exit the process with CTRL + C, then run make run again. If you see the following lines, the backend part of the project has started correctly:

    CBC_PATH=./functions ./venv/bin/python3 -m ixoncdkingress
    2025-09-22 12:06:31,000 - ixoncdkingress - INFO - Starting DocumentDB server
    2025-09-22 12:06:34,069 - ixoncdkingress - INFO - DocumentDB server started successfully
    2025-09-22 12:06:34,080 - ixoncdkingress - INFO - wsgiref listening on http://127.0.0.1:8020/
    2025-09-22 12:06:34,080 - ixoncdkingress - INFO - CBC_PATH: ./functions
  7. To simulate the project's UI Component, open a second bash terminal, navigate to the project's root folder (making sure the venv is still active) and run the command below to start the Simulator. A localhost:8000 page will then automatically open in your browser:

    npx cdk simulate service-logbook

Now that you have made sure that the project is properly set up and running, you can proceed with the next steps.

Keep in mind that you will not have to create new files, but simply tweak existing ones with new functions.

❗️

IMPORTANT: persistence of the data in the Document Store

As stated in the Document Store documentation page, the data exists until the function deletes it; it is also lost when the terminal running the Cloud Function's backend (where you used make run) is killed, either by exiting or with CTRL + C. Make sure to conclude your work and back up all the necessary data before you kill the terminal!

Additionally, if you have uploaded a lot of test data that you want to wipe, killing the terminal and running make run again is a quick way to clear everything and restart from scratch.

How to export data to JSON

This section is made of two parts: one explaining how to export all data, and one explaining how to use an ID range so that data can be downloaded in chunks.

Part 1: Exporting all records as a JSON file

This feature allows a user to download a single JSON file containing all notes, enriched with metadata and a sequential ID for easy tracking.

Step 1.1: Create the Backend Endpoint (notes.py)

First, create a new function in notes.py called export_all to gather and format the data into JSON. Decorators will make this function callable from the frontend via a service file called notes.service.ts.

  • What you will obtain: a new export_all function, decorated with @CbcContext.expose and @notes_endpoint() to make it available to the frontend.
  • Logic:
    • Import time.
    • Fetch all notes using notes_client.get().
    • Iterate through the notes to add a sequential, human-readable listId field (starting from 1) to each note.
    • Package the list of notes and helpful metadata (like exported_on, exported_by, etc.) into a dictionary.
    • Return this dictionary inside a SuccessResponse.
import time

@CbcContext.expose
@notes_endpoint()
def export_all(
    context: CbcContext, notes_client: NotesClient
) -> SuccessResponse[dict]:
    """
    Gathers all notes, keeping the original '_id' and inserting a new 
    sequential 'listId' immediately after it.    
		"""
    all_notes = notes_client.get()
    notes_with_sequential_id = []
    
    for index, note in enumerate(all_notes):
        original_note_dict = note.model_dump(mode="json", by_alias=True)
        
        # 1. Create a new dictionary, starting with the original _id
        reordered_note_dict = {"_id": original_note_dict.get("_id")}
        
        # 2. Insert the new sequential 'id' as the second key
        reordered_note_dict["listId"] = index + 1
        
        # 3. Add all other keys from the original note
        reordered_note_dict.update({
            key: value for key, value in original_note_dict.items() if key != "_id"
        })

        notes_with_sequential_id.append(reordered_note_dict)

    export_data = {
        "exported_on": round(time.time() * 1000),
        "exported_by": context.user.name if context.user else "Unknown",
        "source_agent_or_asset_id": notes_client.agent_or_asset_id,
        "note_count": len(all_notes),
        "notes": notes_with_sequential_id,
    }

    return SuccessResponse(data=export_data)

Step 1.2: Add a Method to the Frontend Service (notes.service.ts)

Next, add a corresponding method in notes.service.ts to call this new backend endpoint.

  • What you will obtain: a new public method exportAll() added to the NotesService class.
  • Logic: this method simply calls the notes.export_all endpoint and returns the promise, acting as a clean bridge between the component and the backend.
public exportAll() {
  return this.client.call('notes.export_all');
}

Step 1.3: Implement the Frontend UI, Logic, Button and its tooltip (service-logbook.svelte)

Finally, you can update the Svelte component to add a button and create the logic to handle the download.

  • What you will obtain: a new "Export to JSON" button in the header.
  • Logic:
    • A new handler function, handleDownloadJsonButtonClick(), is created and placed in the component's <script> block.
    • This function calls await notesService.exportAll().
    • It checks the response for success.
    • It creates a Blob (an in-memory file) from the pretty-printed JSON data.
    • It uses the context.saveAsFile() method to trigger the browser's download dialog through a button click. This method belongs to the ComponentContext module.
  1. Add the Translation Key: in your onMount function, ensure the key 'EXPORT_TO_JSON' is in the context.translate array.
    // Inside onMount()
    onMount(() => {
        translations = context.translate(
        [
          '...', 
          'EXPORT_TO_JSON', 
          '...'
        ],
        undefined,
        { source: 'global' },	
    	);

  2. Create the Handler Function: add the handleDownloadJsonButtonClick function to your <script> block.
    async function handleDownloadJsonButtonClick(): Promise<void> {
      try {
        // This returns the platform's wrapper object
        const response = await notesService.exportAll();
    
        // The actual backend response is inside response.data
        const backendResponse = response.data;
    
        if (backendResponse && backendResponse.success) {
          // The structured export data is now in backendResponse.data
          const jsonString = JSON.stringify(backendResponse.data, null, 2);
          // Create a Blob (a file-like object) from the pretty-printed JSON string
          const data = new Blob([jsonString], { type: 'application/json' });
          
          // Generate a dynamic filename
          const fileName = `${kebabCase(deburr(agentOrAssetName ?? undefined))}_service-logbook.json`;
    
          // Use the context's built-in saveAsFile method to trigger the download
          if ('saveAsFile' in context) {
            context.saveAsFile(data, fileName);
          }
        } else {
          console.error('Failed to export JSON:', backendResponse?.message);
          // Optionally, you could show an error to the user here
        }
      } catch (error) {
        console.error('An error occurred during JSON export:', error);
      }
    }

  3. Add the HTML Button: in your HTML, locate the <div class="card-header-actions">. Place the new button inside this div and, specifically, inside the {#if !!$notes?.length} block and under the EXPORT_TO_CSV button block, as exporting is only possible when notes exist.
    <div class="card-header-actions">
      {#if !!$notes?.length}
    		... EXPORT_TO_CSV button block ...
        <button
            use:createTooltip={{ message: translations.EXPORT_TO_JSON }}
            class="icon-button"
            class:hidden={searchInputVisible}
            data-testid="service-logbook-export-json-button"
            on:click={handleDownloadJsonButtonClick}
           >
             <svg
               enable-background="new 0 0 24 24"
               height="24px"
               viewBox="0 -960 960 960"
               width="24px"
               fill="#000000"
               ><path
                d="M480-320 280-520l56-58 104 104v-326h80v326l104-104 56 58-200 200ZM240-160q-33 0-56.5-23.5T160-240v-120h80v120h480v-120h80v120q0 33-23.5 56.5T720-160H240Z"
               /></svg
              >
            </button>	
      {/if}
    	... more different buttons ...
     </div>

Part 2: Exporting Records by ID Range

This feature allows a user to download a specific slice of the notes by providing a start and end ID in a pop-up form. This can be particularly useful when the user needs to export a lot of data: thanks to this functionality, they can do it in chunks so as not to overload the system.

Step 2.1: Create the Backend Endpoint for Ranged Export (notes.py)

Create another function in notes.py to handle the filtering of the IDs.

What you will obtain: a new exposed function called export_by_id_range.

Logic:

  • It accepts start_id and end_id parameters from the frontend.
  • It fetches all notes at first, as the sequential ID does not exist in the database.
  • It assigns a temporary sequential "id" to each note in memory.
  • It filters this list to keep only the notes where the "id" falls within the requested range.
  • It returns the filtered list and metadata in a SuccessResponse.
@CbcContext.expose
@notes_endpoint()
def export_by_id_range(
    context: CbcContext, notes_client: NotesClient, start_id: int, end_id: int
) -> SuccessResponse[dict]:
    """
    Exports a specific range of notes based on a sequential ID.
    """
    # 1. Get all notes, sorted chronologically
    all_notes = notes_client.get()

    # 2. Assign temporary sequential IDs to each note
    notes_with_ids = []
    for index, note in enumerate(all_notes):
        note_dict = note.model_dump(mode="json", by_alias=True)
        note_dict["id"] = index + 1
        notes_with_ids.append(note_dict)

    # 3. Filter the list to get only the notes within the requested range
    filtered_notes = [
        note for note in notes_with_ids if start_id <= note["id"] <= end_id
    ]

    # 4. Structure the data for export
    export_data = {
        "exported_on": round(time.time() * 1000),
        "exported_by": context.user.name if context.user else "Unknown",
        "source_agent_or_asset_id": notes_client.agent_or_asset_id,
        "note_count": len(filtered_notes),
        "id_range": f"{start_id}-{end_id}",
        "notes": filtered_notes,
    }

    return SuccessResponse(data=export_data)
👍

About metadata in JSON files

The exported_on, exported_by, source_agent_or_asset_id and note_count fields are metadata variables that are not visible when notes are exported as CSV. They were added to the JSON because, since MongoDB accepts JSON files for import functionalities, we have to make sure this information is available for reading when the notes are downloaded; within MongoDB itself, such metadata is contained in the _id field, a 12-byte BSON ObjectId that encodes information about when and where the document was created, such as the creation timestamp. This is an example of how a final, downloaded JSON file should look:

{
  "exported_on": 1758205130140,
  "exported_by": "Simulated User",
  "source_agent_or_asset_id": "agent",
  "note_count": 1,
  "notes": [
    {
      "_id": "68cc14c6605f77d1d6e6cb07",
      "listId": 1,
      "user": "userId",
      "text": "<div>This is a test.</div>",
      "created_on": 1758205126761,
      "author_id": "userId",
      "author_name": "Simulated User",
      "editor_id": null,
      "editor_name": null,
      "updated_on": null,
      "subject": "  ",
      "category": null
    }
  ]
}
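
As a side note, the creation timestamp embedded in such an _id can be decoded with PyMongo's bson package (a minimal sketch, assuming bson is available in the project's venv, as it ships with PyMongo):

from bson import ObjectId

# The _id value from the exported JSON example above
oid = ObjectId("68cc14c6605f77d1d6e6cb07")

# generation_time is a timezone-aware UTC datetime derived from the first
# 4 bytes of the ObjectId (whole seconds since the Unix epoch)
print(oid.generation_time)
print(int(oid.generation_time.timestamp() * 1000))  # comparable to created_on above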

Step 2.2: Add a Method to the Frontend Service (notes.service.ts)

Now, add another method for the ID-ranged export to the NotesService class in the notes.service.ts file.

What you will obtain: a new public method exportByIdRange() added to the NotesService class.

Logic: It calls the notes.export_by_id_range endpoint, passing the start_id and end_id as parameters.

public exportByIdRange(params: { start_id: number; end_id: number }) {
  return this.client.call('notes.export_by_id_range', params);
}

Step 2.3: Implement the Frontend Pop-up and Logic (service-logbook.svelte)

Finally, modify the Svelte component to use a pop-up for a cleaner UI.

What you will obtain: an "Export by ID range" button added to the header.

Logic:

  • A new handler function, handleOpenExportRangeDialog(), is created and placed in the component's <script> block.
  • This function calls context.openFormDialog() to show a pop-up with "From ID" and "To ID" number inputs.
  • If the user submits the form, the function takes the start_id and end_id from the result.
  • It then calls await notesService.exportByIdRange({ start_id, end_id }) with these values.
  • The rest of the logic (creating a Blob, triggering the download) is the same as the export all function.

  1. Add the Translation Key: in onMount, add the key 'EXPORT_JSON_BY_ID_RANGE' to the context.translate array.
    // Inside onMount()
    onMount(() => {
        translations = context.translate(
        [
          '...',
    			'EXPORT_TO_JSON', 
          'EXPORT_JSON_BY_ID_RANGE', 
          '...'
        ],
        undefined,
        { source: 'global' },	
    	);

  2. Create the Handler Function: add the handleOpenExportRangeDialog function to your <script> block.
    async function handleOpenExportRangeDialog(): Promise<void> {
      // 1. Open a form dialog to get the start and end IDs from the user
      const result = await context.openFormDialog({
        title: 'Export by ID Range',
        inputs: [
          {
            key: 'start_id',
            type: 'Number' as const,
            label: 'From ID:',
            required: true,
          },
          {
            key: 'end_id',
            type: 'Number' as const,
            label: 'To ID:',
            required: true,
          },
        ],
        initialValue: {
          start_id: 1,
          end_id: 10,
        },
        submitButtonText: 'Export',
      });
    
      // 2. If the user submitted the form, proceed with the download
      if (result && result.value) {
        const { start_id, end_id } = result.value;
        
        try {
          // Call the service method with the values from the dialog
          const response = await notesService.exportByIdRange({ start_id, end_id });
          const backendResponse = response.data;
    
          if (backendResponse && backendResponse.success) {
            const jsonString = JSON.stringify(backendResponse.data, null, 2);
            const data = new Blob([jsonString], { type: 'application/json' });
            const fileName = `${kebabCase(deburr(agentOrAssetName ?? undefined))}_service-logbook_notes_${start_id}-${end_id}.json`;
    
            if ('saveAsFile' in context) {
              context.saveAsFile(data, fileName);
            }
          } else {
            console.error('Failed to export JSON range:', backendResponse?.message);
          }
        } catch (error) {
          console.error('An error occurred during ranged JSON export:', error);
        }
      }
    }

  3. Add the HTML Button: place the new button inside the <div class="card-header-actions"> and within the {#if !!$notes?.length} block, under the EXPORT_TO_JSON button block, as exporting is only possible when notes exist in this case as well.
    <div class="card-header-actions">
      {#if !!$notes?.length}
    		... EXPORT_TO_JSON button block ...
        <button
          use:createTooltip={{ message: translations.EXPORT_JSON_BY_ID_RANGE }}
                  class="icon-button"
          class:hidden={searchInputVisible}
          data-testid="service-logbook-export-range-button"
          on:click={handleOpenExportRangeDialog}
         >
          <svg
                height="24px"
                viewBox="0 0 24 24"
                width="24px"
                fill="#000000"
                ><path d="M0 0h24v24H0V0z" fill="none" /><path
                  d="M10 18h4v-2h-4v2zM3 6v2h18V6H3zm3 7h12v-2H6v2z"
                /></svg
              >
        </button>	
      {/if}
    	... more different buttons ...
     </div>


How to import data from JSON

This functionality allows the system to efficiently receive and process a JSON file containing multiple notes, save them to the database, and report back with the newly created, complete records.

Step 1.1: Create a "Bulk Add" Database Function (client.py)

To handle importing multiple notes at once, add a new add_many function to the NotesClient class in client.py. This file acts as the data access layer for the application's notes feature and handles all direct interactions with the database for CRUD operations. Within the file, NotesClient is the part of the application that directly communicates with the DocumentDBClient, providing APIs such as get() or add().

What you will obtain: a new method add_many in the NotesClient.

Logic:

  • Accepts a list of NoteAdd objects from the API endpoint.
  • Transforms each partial NoteAdd object into a full Note object, which automatically generates a unique _id, a created_on timestamp, and adds author information.
  • Bundles all the new note documents into a list and uses MongoDB's $each operator to push them to the database in a single, efficient operation.
  • Returns the complete list of newly created Note objects, including their new IDs and timestamps, back to the calling function.
		"""
    Takes a list of NoteAdd objects, creates full Note objects with author info,
    and adds them to the database in a single operation.
    """
    def add_many(self, notes_to_add: List[NoteAdd]) -> List[Note] | ErrorResponse[None]:
        if not notes_to_add:
            return []

        try:
            full_notes = [
                Note(
                    text=add_item.text,
                    subject=add_item.subject,
                    category=add_item.category,
                    author_id=self.user_id,
                    author_name=self.user_name,
                )
                for add_item in notes_to_add
            ]
            
            documents_to_push = [note.model_dump(by_alias=True) for note in full_notes]

            result = self.document_client.update_one(
                {"agent_or_asset_id": self.agent_or_asset_id},
                {"$push": {"notes": {"$each": documents_to_push}}}
            )

            if result.modified_count == 0:
                return ErrorResponse(message="Notes not added")

            return full_notes
        
        except Exception as e:
            return ErrorResponse(message=f"An unexpected error occurred during bulk add: {e}")

Step 1.2: Create the Import API Endpoint (notes.py)

Next, create a public API endpoint called import_notes to receive the JSON data from the frontend.

What you will obtain: a new import_notes function in notes.py, decorated with @CbcContext.expose and @notes_endpoint() to make it accessible to the frontend.

Logic:

  • Receives a raw json_data string from the frontend.
  • Parses the string into a Python dictionary and validates that it contains a list of notes.
  • Calls the notes_client.add_many() function with the validated notes.
  • Crucially, it takes the complete Note objects returned by add_many and places them in the data field of the SuccessResponse sent back to the frontend.
# Assumes json is already imported in notes.py and that ValidationError is
# imported from pydantic (from pydantic import ValidationError).
@CbcContext.expose
@notes_endpoint()
def import_notes(
    context: CbcContext, notes_client: NotesClient, json_data: str
) -> SuccessResponse[list] | ErrorResponse[None]:
    """
    Imports notes from a JSON string, validates them, and adds them to the database.
    """
    try:
        # 1. Parse the incoming JSON string into a Python dictionary
        data = json.loads(json_data)
        # 2. Extract the list of notes from the 'notes' key
        notes_to_import = data.get("notes", [])

        if not isinstance(notes_to_import, list):
            return ErrorResponse(message="The 'notes' field in the JSON file must be a list.")

        if not notes_to_import:
            return ErrorResponse(message="No notes found in the provided file.")

        # 3. Validate each note against the NoteAdd model, ensuring the data is safe before inserting.
        validated_notes = [NoteAdd(**note) for note in notes_to_import]

        # 4. Use the existing client method to add the notes to the database
        created_notes = notes_client.add_many(validated_notes)
        
        # 5. Handle potential errors from the client
        if isinstance(created_notes, ErrorResponse):
            return created_notes

        # 6. Return a success message with the complete, newly created notes
        return SuccessResponse(
            message=f"Successfully imported {len(created_notes)} notes.",
            data=created_notes
        )
    except json.JSONDecodeError:
        return ErrorResponse(message="Invalid JSON format. The file could not be read.")
    except ValidationError as e:
        return ErrorResponse(message=f"Data validation failed for one or more notes: {e}")
    except Exception as e:
        return ErrorResponse(message=f"An unexpected error occurred during import: {e}")

Step 1.3: Centralize Logic in the Service Layer (notes.service.ts)

The NotesService will handle the entire import process, keeping the UI component clean and simple.

What you will obtain: a new public method importNotes added to the NotesService class.

Logic:

  • Accepts the json_data string from the component.
  • Calls the notes.import_notes backend endpoint.
  • Waits for a successful response from the backend.
  • Upon success, it calls its own this.load() method. This is the key to the automatic refresh after importing the notes, as it tells the service to re-fetch the entire list of notes from the server, guaranteeing the UI is in sync.
public importNotes(json_data: string) {
  return this.client.call('notes.import_notes', { json_data }).then(response => {
    const backendResponse = response.data;
    if (backendResponse && backendResponse.success) {
      this.load();
    }
  }).catch(error => {
    console.error('Error during file import:', error);
  });
}

Step 1.4: Create the User Interface and Event Handlers (service-logbook.svelte)

Finally, update the Svelte component to include the necessary HTML elements and JavaScript functions for user interaction.

What you will obtain: the <script> block is updated with new variables and two new functions (handleImportButtonClick and handleFileSelected), and the HTML is updated with a new button and a hidden file input.

Logic (JavaScript):

  • In the onMount function, the NotesService is initialized, and a tooltip is programmatically attached to the importButton element after it has been rendered to the DOM.
  • The handleImportButtonClick function's only job is to programmatically click the hidden file input, which opens the browser's file selection dialog.
  • The handleFileSelected function is triggered when the user chooses a file. It reads the file's content and passes it directly to notesService.importNotes(), delegating all further logic to the service.

Logic (HTML):

  • A hidden <input type="file"> is bound to the handleFileSelected function via the on:change event.
  • A visible <button> is created for the user. Its on:click event calls handleImportButtonClick, and bind:this connects it to the importButton variable for the tooltip logic.
  1. Add the Translation Key & Necessary Variables: in the <script> block, add the translation key, and declare variables for the button and the hidden file input.
    // Add these variables after the ones that have been already declared
    let importButton: HTMLButtonElement;
    let fileInput: HTMLInputElement;
    
    // Inside onMount()
    onMount(() => {
        translations = context.translate(
        [
            '...',
            'EXPORT_TO_JSON',
            'EXPORT_JSON_BY_ID_RANGE',
            'IMPORT_FROM_JSON',
            '...'
        ],
        undefined,
        { source: 'global' },	
        );
    // ...

  2. Initialize the Tooltip in onMount: inside the onMount function, add the call to create the tooltip for the import button after it has been rendered to the page.
    // Inside onMount() in service-logbook.svelte
    onMount(() => {
        // ... after translations are loaded ...
    
        createTooltip(addButton, { message: translations.ADD_NOTE });
        createTooltip(importButton, { message: translations.IMPORT_FROM_JSON });
    
        // ... rest of onMount ...
    });

  3. Create the Handler Functions: add the handleImportButtonClick and handleFileSelected functions to your <script> block.
    function handleImportButtonClick(): void {
        // Programmatically click the hidden file input to open the file dialog
        fileInput.click();
    }
    
    // This function is called when the user selects a file
    async function handleFileSelected(event: Event): Promise<void> {
        const input = event.target as HTMLInputElement;
        const file = input.files?.[0];
    
        if (!file) {
            return;
        }
    
        const reader = new FileReader();
        reader.onload = (e) => {
            const jsonContent = e.target?.result as string;
            // Call the service
            notesService.importNotes(jsonContent);
        };
    
        // Also clear the input value to allow re-importing the same file
        reader.onloadend = () => {
            if (input) {
            	input.value = '';
            }
        };
        reader.readAsText(file);
    }

  4. Add the HTML Elements: add the hidden <input> element to your component after the closed {#if !!$notes?.length} condition block, and then add the visible <button> in the header actions.
    Please note: the reason we use bind:this for the importButton is that, just like for the addButton, the tooltip for this button has to be created in onMount and be made available before the rest of the buttons. Those only appear when there are already some notes in the Service Logbook, so they do not need such logic to be shown properly.
    <div class="card" bind:this={rootEl} class:is-narrow={isNarrow}>  
    	<input
          type="file"
          bind:this={fileInput}
          on:change={handleFileSelected}
          style="display: none;"
          accept=".json"
      />
    	... more containers...
      <div class="card-header-actions">
    			...more stuff...
          {#if !!$notes?.length}
    			... more logic and elements ...
          {/if}
    
          <button
              bind:this={importButton}
              class="icon-button"
              class:hidden={searchInputVisible}
              data-testid="service-logbook-import-json-button"
              on:click={handleImportButtonClick}
          >
              <svg
              enable-background="new 0 0 24 24"
              height="24px"
              viewBox="0 -960 960 960"
              width="24px"
              fill="#000000"
              ><path
                  d="M440-320v-326L336-542l-56-58 200-200 200 200-56 58-104-104v326h-80ZM240-160q-33 0-56.5-23.5T160-240v-120h80v120h480v-120h80v120q0 33-23.5 56.5T720-160H240Z"
              /></svg
              >
          </button>
    
          <button bind:this={addButton} ... >
              </button>
      </div>
    </div>