# Batch Engine API Basics - Importing Data
Liferay’s Headless Batch Engine provides REST APIs to import and export data. Call these services to import data to Liferay.
## Importing Data
Start a new Liferay DXP instance by running the Liferay Docker image.
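For example (the image tag is a placeholder; use the version you want to test against):

```bash
docker run -it -m 8g -p 8080:8080 liferay/portal:<tag>
```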
Sign in to Liferay at http://localhost:8080 using the email address `test@liferay.com` and the password `test`. When prompted, change the password to `learn`.
Then follow these steps:
1. Download and unzip Batch Engine API Basics.

2. To import data, you must have the fully qualified class name of the entity you're importing. You can get the class name from the API explorer in your installation at `/o/api`. Scroll down to the Schemas section and note the `x-class-name` field of the entity you want to import.

3. Use the `ImportTask_POST_ToInstance.sh` cURL script to import accounts to your Liferay instance. On the command line, navigate to the `curl` folder and execute the script with the fully qualified class name of Account as a parameter (see the sketch after this list). The JSON response shows the creation of a new import task; note the `id` of the task.

4. The current `executeStatus` is `INITIAL`. It denotes the submission of a task to the Batch Engine. You must wait until this is `COMPLETED` to verify the data. On the command line, execute the `ImportTask_GET_ById.sh` script, replacing `1234` with the ID of your import task. If the `executeStatus` is `COMPLETED`, you can verify the imported data. If not, execute the command again to ensure the task has finished execution. If the `executeStatus` shows `FAILED`, check the `errorMessage` field to understand what went wrong.

5. Verify the imported data by opening the Global Menu and navigating to Control Panel → Accounts. See that two new accounts have been added.

6. You can also call the REST service using the Java client. Navigate out of the `curl` folder and into the `java` folder, then compile the source files (see the sketch after this list).

7. Run the `ImportTask_POST_ToInstance` class, replacing `able` with the fully qualified name of the class and `baker` with the JSON data you want to import (for example, `Account` data). Note the `id` of the import task from the JSON response.

8. Run the `ImportTask_GET_ById` class, replacing `1234` with the ID of your import task. Once the `executeStatus` shows `COMPLETED`, you can verify the data as shown in the steps above.
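Here's a minimal command-line sketch of these steps. The Account class name is an assumption for a typical installation (verify the `x-class-name` value under Schemas in `/o/api`), and each Java class's `main` method comment documents its exact arguments:

```bash
# From the curl folder: import two accounts to the instance (step 3).
./ImportTask_POST_ToInstance.sh com.liferay.headless.admin.user.dto.v1_0.Account

# Poll the task status until executeStatus is COMPLETED, replacing 1234 with
# the id from the previous response (step 4).
./ImportTask_GET_ById.sh 1234

# From the java folder: compile the Java client examples (step 6), then run
# them (steps 7 and 8) with the arguments from each main method comment.
javac -classpath .:* *.java
java -classpath .:* ImportTask_POST_ToInstance
java -classpath .:* ImportTask_GET_ById
```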
## Examine the cURL Command

The `ImportTask_POST_ToInstance.sh` script calls the REST service using cURL.
Here are the command’s arguments:
| Arguments | Description |
| --- | --- |
| `-H "Content-Type: application/json"` | Indicates that the request body format is JSON. |
| `-X POST` | The HTTP method to invoke at the specified endpoint. |
| `"http://localhost:8080/o/headless-batch-engine/v1.0/import-task/${1}"` | The REST service endpoint. |
| `-d "[{\"name\": \"Able\", \"type\": \"business\"}, {\"name\": \"Baker\", \"type\": \"guest\"}]"` | The data you are requesting to post. |
| `-u "test@liferay.com:learn"` | Basic authentication credentials. |
Basic authentication is used here for demonstration purposes. For production, you should authorize users via OAuth2. See Use OAuth2 to authorize users for a sample React application that uses OAuth2.
The other cURL commands use similar JSON arguments.
## Examine the Java Class

The `ImportTask_POST_ToInstance.java` class imports data by calling the Batch Engine's REST service.
This class invokes the REST service using only three lines of code:
| Line (abbreviated) | Description |
| --- | --- |
| `ImportTaskResource.Builder builder = ...` | Gets a `Builder` for generating an `ImportTaskResource` service instance. |
| `ImportTaskResource importTaskResource = builder.authentication(...).build();` | Specifies basic authentication and generates an `ImportTaskResource` service instance. |
| `importTaskResource.postImportTask(...);` | Calls the `importTaskResource.postImportTask` method and passes the data to post. |
Note that the project includes the `com.liferay.headless.batch.engine.client.jar` file as a dependency. You can find client JAR dependency information for all REST applications in the API explorer in your installation at `/o/api`.
The `main` method's comment demonstrates running the class.
The other example Java classes are similar to this one, but call different `ImportTaskResource` methods.
See ImportTaskResource for service details.
Below are examples of calling other Batch Engine import REST services using cURL and Java.
## Get the ImportTask Status

You can get the status of an import task by executing the following cURL or Java command. Replace `1234` with the ID of your import task.
### ImportTask_GET_ById.sh
Command:
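A sketch of running the script, assuming it takes the import task ID as its only parameter:

```bash
./ImportTask_GET_ById.sh 1234
```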
Code:
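The script wraps a GET call shaped like this, using the endpoint and credentials from the rest of this tutorial:

```bash
curl \
  -X GET \
  "http://localhost:8080/o/headless-batch-engine/v1.0/import-task/1234" \
  -u "test@liferay.com:learn"
```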
### ImportTask_GET_ById.java

Run the `ImportTask_GET_ById` class. Replace `1234` with the ID of your import task.
Command:
Code:
## Importing Data to a Site

You can import data to a site by executing the following cURL or Java command. The example imports blog posts to a site. Find your site's ID and replace `1234` with it. When using another entity, you must also update the fully qualified class name parameter and the data to import in the cURL script.
### ImportTask_POST_ToSite.sh
Command:
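A sketch of the invocation, assuming the script takes the site ID and the fully qualified class name as its two parameters (the `BlogPosting` class name is an assumption; verify it in `/o/api`):

```bash
./ImportTask_POST_ToSite.sh 1234 com.liferay.headless.delivery.dto.v1_0.BlogPosting
```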
Code:
### ImportTask_POST_ToSite.java

Run the `ImportTask_POST_ToSite` class. Replace `1234` with your site's ID, `able` with the fully qualified name of the class, and `baker` with the JSON data you want to import.
Command:
For example, import `BlogPosting` data:
Code:
The JSON response displays information from the newly created import task. Note the `id` to keep track of its `executeStatus`.
## Put the Imported Data
You can use the following cURL or Java command to completely overwrite existing data using the Batch Engine. The example shows updating existing account data. When using another entity, you must update the fully qualified class name parameter and the data to overwrite in the cURL script.
### ImportTask_PUT_ById.sh
Command:
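A sketch of the invocation, assuming the script takes the fully qualified class name as its parameter (the `Account` class name is an assumption; verify it in `/o/api`):

```bash
./ImportTask_PUT_ById.sh com.liferay.headless.admin.user.dto.v1_0.Account
```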
Code:
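The script's core call is shaped like this sketch; the payload carries the `id` of each entity to overwrite, along with the new field values:

```bash
curl \
  -H "Content-Type: application/json" \
  -X PUT \
  "http://localhost:8080/o/headless-batch-engine/v1.0/import-task/${1}" \
  -d "[{\"id\": 1234, \"name\": \"Able Updated\", \"type\": \"business\"}, {\"id\": 5678, \"name\": \"Baker Updated\", \"type\": \"guest\"}]" \
  -u "test@liferay.com:learn"
```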
### ImportTask_PUT_ById.java

Run the `ImportTask_PUT_ById` class. Replace `able` with the fully qualified name of the class and `baker` with the JSON data to overwrite what's there. The data should contain the IDs of the entities you want to overwrite.
Command:
For instance, if you want to overwrite existing `Account` data, replace `1234` and `5678` with the IDs of the existing accounts:
Code:
## Delete the Imported Data
You can use the following cURL or Java command to delete existing data using the Batch Engine. The example deletes account data. When using another entity, you must update the fully qualified class name parameter and also the data to delete in the cURL script.
### ImportTask_DELETE_ById.sh
Command:
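A sketch of the invocation, again assuming the class name is the script's parameter:

```bash
./ImportTask_DELETE_ById.sh com.liferay.headless.admin.user.dto.v1_0.Account
```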
Code:
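A sketch of the underlying call; the payload items are shown with only `id` fields, which is an assumption based on the note above:

```bash
curl \
  -H "Content-Type: application/json" \
  -X DELETE \
  "http://localhost:8080/o/headless-batch-engine/v1.0/import-task/${1}" \
  -d "[{\"id\": 1234}, {\"id\": 5678}]" \
  -u "test@liferay.com:learn"
```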
### ImportTask_DELETE_ById.java

Run the `ImportTask_DELETE_ById` class. Replace `able` with the fully qualified name of the class and `baker` with the JSON data identifying the entries to delete. The data should contain the IDs of the entities you want to delete.
Command:
For instance, if you want to delete `Account` data, replace `1234` and `5678` with the IDs of the existing accounts:
Code:
## Get Contents of the Imported Data

You can retrieve the data you imported with the following cURL and Java commands. Replace `1234` with the import task's ID. The data is downloaded as a `.zip` file in the current directory.
### ImportTaskContent_GET_ById.sh
Command:
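A sketch of running the script, assuming it takes the import task ID as its parameter:

```bash
./ImportTaskContent_GET_ById.sh 1234
```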
Code:
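The script wraps a call shaped like this; the output file name is illustrative:

```bash
curl \
  -X GET \
  "http://localhost:8080/o/headless-batch-engine/v1.0/import-task/1234/content" \
  -u "test@liferay.com:learn" \
  --output importTaskContent.zip
```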
### ImportTaskContent_GET_ById.java

Command:
Code:
The API Explorer lists all of the Headless Batch Engine services and schemas and has an interface to try out each service.
## Importing Data from a Batch Export File

You can import data that you exported and downloaded via the Batch Engine after you extract it. Once you have the JSON file containing the data you want to import into your Liferay instance, run a command with the `--data` parameter and an `@` in front of the path to the file (e.g., `--data @export.json`).
For example, this command imports accounts from a batch export file in a folder called `export1`:
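A sketch of such a command, assuming the extracted export file is `export1/Account.json` (a hypothetical name) and the Account class name from the earlier steps:

```bash
curl \
  -H "Content-Type: application/json" \
  -X POST \
  "http://localhost:8080/o/headless-batch-engine/v1.0/import-task/com.liferay.headless.admin.user.dto.v1_0.Account" \
  --data @export1/Account.json \
  -u "test@liferay.com:learn"
```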
## Importing Custom Objects

To import custom object entries, you must provide the `ObjectEntry` class name in the URL and pass the object's name in the `taskItemDelegateName` query parameter.
For example, this command imports a single entry for the instance-scoped `Able` object (with a `name` and a `number` field):
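A sketch of the call, assuming `com.liferay.object.rest.dto.v1_0.ObjectEntry` as the object entry class name (verify it in `/o/api`) and illustrative field values:

```bash
curl \
  -H "Content-Type: application/json" \
  -X POST \
  "http://localhost:8080/o/headless-batch-engine/v1.0/import-task/com.liferay.object.rest.dto.v1_0.ObjectEntry?taskItemDelegateName=Able" \
  -d "[{\"name\": \"First Able\", \"number\": 1}]" \
  -u "test@liferay.com:learn"
```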
## Keeping Original Creator Data for Objects
Liferay DXP 2025.Q1+/Portal GA132+
By default, when you import object data with the batch engine API, the imported object entries have newly created metadata, including the creator information. This makes the user who authenticated the import the new owner of every object.
However, you can change this behavior to keep the originally exported creator data by adding the `importCreatorStrategy` parameter at the end of the URL. This parameter has two valid values: `KEEP_CREATOR` and `OVERWRITE_CREATOR` (the default value).
For example, this command imports data from a file for the instance-scoped `Able` object while keeping the original creator:
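A sketch building on the custom object example above; the class name and file path are assumptions:

```bash
curl \
  -H "Content-Type: application/json" \
  -X POST \
  "http://localhost:8080/o/headless-batch-engine/v1.0/import-task/com.liferay.object.rest.dto.v1_0.ObjectEntry?taskItemDelegateName=Able&importCreatorStrategy=KEEP_CREATOR" \
  --data @export1/Able.json \
  -u "test@liferay.com:learn"
```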
If you import object data with the `KEEP_CREATOR` strategy but the user referenced in the imported data is not in the target Liferay instance, the parameter is ignored and the `OVERWRITE_CREATOR` strategy is used instead.