Search Results

All Results (597)
Auto Upgrade in DXP 7.3 and Above
Issue: Starting with Liferay 7.3, new module upgrades no longer run automatically on startup by default. Resolution: The following options are available to run the upgrades that may be included in fix packs and service packs:...
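The options themselves are cut off in this excerpt. For reference only (not necessarily the article's exact resolution), Liferay DXP 7.3 lets administrators run pending module upgrades from the Gogo shell, or re-enable automatic execution with a portal property; the module name below is only an example:

```
# From the Gogo shell (illustrative, example module name):
upgrade:list
upgrade:execute com.liferay.asset.service

# Or, in portal-ext.properties, to run module upgrades automatically on startup again:
upgrade.database.auto.run=true
```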
Violation of PRIMARY KEY constraint on startup after DB upgrade
Issue: After successfully upgrading your database with the db-upgrade tool, you might encounter an error similar to the following on the first startup:...
Warnings get thrown while attempting to upgrade Lock_ table
Issue: Several warnings are thrown while the migration tool tries to upgrade the Lock_ table. WARN [main][UpgradeProcess:446] Attempting to upgrade table Lock_ by recreating the table due to: You have an...
Upgrade from Liferay DXP 7.2 FP5 to Liferay DXP 7.3 GA1 fails
Issue: When running the database upgrade process from Liferay DXP 7.2 FP5 to Liferay DXP 7.3 GA1, the upgrade fails with Duplicate column name 'defaultVirtualHost':...
Groovy Script for deleting users from the database
Directly modifying the database is not advised, because we cannot assist with any damage it might cause. The suggested approach in these cases is a Groovy script that uses Liferay's API...
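The script itself is not shown in this excerpt. As a rough illustration of the approach, deleting a user through Liferay's API from the Script console (Control Panel > Server Administration > Script) might look like the sketch below; the screen name is a placeholder, not part of the article:

```groovy
// Minimal sketch only, not the article's actual script: delete one user
// through Liferay's service API instead of direct SQL.
// "user.to.delete" is a placeholder; adapt the lookup to your own criteria.
import com.liferay.portal.kernel.service.UserLocalServiceUtil
import com.liferay.portal.kernel.util.PortalUtil

long companyId = PortalUtil.getDefaultCompanyId()

def user = UserLocalServiceUtil.getUserByScreenName(companyId, "user.to.delete")

// Going through the service layer keeps listeners, search indexes, and
// referencing tables consistent, unlike a direct SQL DELETE.
UserLocalServiceUtil.deleteUser(user.getUserId())

out.println("Deleted user: " + user.getFullName())
```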
How-To Generate a HAR file for Liferay Support
An HTTP Archive (HAR) file is a format used to record the traffic between a user's browser and a site. It can be used to identify performance issues such as slow load times, slow page renders, or other bottlenecks. This is...
ETAG response header is missing
Issue: The ETAG response header is missing on a page. Environment: DXP 7.1. Resolution: If a page contains at least one non-cacheable portlet, such as Asset Publisher, the ETAG response header is not added by design. The...
Staging Publishing Takes a Long Time
Issue: Publishing from Staging takes a very long time; staging performance degrades when publishing content. Environment: Liferay Staging. Resolution: The most common cause of this is trying to publish large amounts of data at a time...