Online shops involve a large number of data import and export processes. Product data and price data have to be imported, and order data have to be exported. The product and price data often originate from files on another server, and the order data are often delivered to another server as well. The Transport Framework can transport files from one network location to another, where they can be processed.
The source code belongs to the cartridges bc_transport and bc_transport_orm. Both are part of the p_platform component set.
The UI source code is added to the corresponding Commerce Management cartridge via an extension point.
The Transport Framework can perform file transports via SFTP, FTP, HTTP(S), EMAIL, or Azure Storage Account File Shares.
For SFTP and FTP, both pushing and pulling are possible. Due to the nature of HTTP and EMAIL, files can only be read from HTTP(S) and sent by mail.
To access the Transport Framework, the user must have the access privilege Transport Manager.
A configuration of the Transport Framework is stored in the database tables TRANSPORTCONFIG and TRANSPORTCONFIG_AV.
The feature requires a DBinit of the cartridge bc_transport_orm from p_platform.
A Transport Configuration can also be created with a DBinit step or DBmigrate step.
The preparers are:
com.intershop.component.transport.dbinit.PrepareTransportConfiguration
com.intershop.component.transport.dbmigrate.AddTransportConfiguration
Each preparer requires a property file for configuration. The preparer reads the keys defined in the class TransportConstants from the property file and writes the result to the database. There is no parameter validation. This way, a blueprint can be created via DBmigrate and completed later.
Examples can be found in the source code in the folder bc_transport_orm/staticfiles/cartridge/lib/com/intershop/component/transport/dbinit.
SFTP.properties
domain = inSPIRED-inTRONICS
process.type = SFTP
# Common settings
process.displayname = testSFTP
process.id = testSFTP
file.include.pattern =
file.exclude.pattern =
location.local = ${IS_SITES}\\${SITE_NAME}\\units\\${UNIT_NAME}\\impex\\export
location.archive =
# remote location settings
remote.protocol = SFTP
remote.hostname = localhost
remote.port =
location.remote = ./test
# authentication settings
authentication.method = PASSPHRASE
authentication.username = tester
authentication.password = password
authentication.keyfilepath =
# transfer settings
process.direction = PUSH
process.transferlimit =
process.delete = true
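To execute the preparer during DBinit, it is typically registered as a step in the cartridge's dbinit.properties, with the property file passed as an argument. The following is a sketch; the step number and the resource path are illustrative, not taken from the actual cartridge:

```
# Illustrative DBinit step; step number and resource path are assumptions
Class20 = com.intershop.component.transport.dbinit.PrepareTransportConfiguration \
          com/intershop/component/transport/dbinit/SFTP.properties
```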
Log in to the Commerce Management application (organization = Operations).
Ensure you have the required access privilege.
Click Transport Configuration in the left navigation.
Each configuration belongs to a domain.
Select a domain and click Apply.
All available configurations for the domain are displayed in the list.
Go to the Transport Configuration list.
Select a domain and click Apply.
Enter a Name.
Select a Type (EMAIL, HTTP, FTP, SFTP, AZURE).
Click Add.
Go to the Transport Configuration list.
Select a domain and click Apply.
Mark the checkbox of the desired configuration.
You may select multiple configurations at once.
Click Delete and confirm with OK.
Go to the Transport Configuration list.
Select a domain and click Apply.
Click the link (Name column) of the desired configuration.
The configuration's detail page is displayed.
Enter all required parameters.
Depending on the selected Type of the configuration, different parameters have to be specified. Each input field provides a help text.
Click Apply.
Note
A configuration cannot be saved until all mandatory fields contain valid parameters. Until then, a message is displayed stating that the configuration is invalid.
Please note: Only basic validation is performed on the input fields. For example, if a URL is required, the system only checks whether the entered string can be parsed as a URL, not whether the endpoint exists.
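The framework's own validator is not part of this article; as an illustration of what "basic validation" means, a syntax-only check can be sketched with the standard java.net.URI class (the helper name is hypothetical, not Intershop API):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UrlCheck
{
    /**
     * Returns true if the string is syntactically a URL with a scheme and host.
     * The endpoint is never contacted, so a non-existent host still passes.
     */
    static boolean isParsableUrl(String s)
    {
        try
        {
            URI uri = new URI(s);
            return uri.getScheme() != null && uri.getHost() != null;
        }
        catch (URISyntaxException e)
        {
            return false;
        }
    }

    public static void main(String[] args)
    {
        // A well-formed URL passes even though the host does not exist.
        System.out.println(isParsableUrl("https://no-such-host.example/path"));
        // A string that cannot be parsed as a URL fails.
        System.out.println(isParsableUrl("not a url"));
    }
}
```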
The following example shows settings for SFTP:
For a push or pull, the host and the remote location must be entered. The remote location is the path to the folder that is the source or target of a push or pull event.
For Blob Container support, see below. This section is only valid up to ICM 12.1.
The following example shows settings for Azure storage and uses account name and account key for authentication:
Blob Container support
Blob storage support is available with ICM 12.2 and higher.
The following example shows settings for Azure storage.
You can choose between a File Share and a Blob Container. To select a file share, use file://<filesharename>. For a Blob Container, use blob://<containername>.
The remote location specifies the path in the File Share or Container.
The account name and key can be found in the Azure portal:
Blob Container and Managed Identity support
Managed Identity support is available with ICM 14.0.1 and higher and only works with Blob Containers. See Managed identities for Azure resources.
This enables the most secure connection to a Blob Container. The ICM must be deployed and configured with a Managed Identity or Workload Identity.
An example can be found on the Intershop Helm charts: GitHub: intershop / helm-charts | example-managed-identity-values.yaml
# Azure Workload Identity configuration
workloadIdentity:
  # Enable or disable Azure Workload Identity support
  enabled: false
  # The client ID of the Azure managed identity (user-assigned identity, mandatory if workloadIdentity.enabled is true)
  clientId:
  # The tenant ID of the Azure AD tenant (optional, only required if the identity is defined inside another tenant than the K8s cluster)
  tenantId:
Once done, it can be used in the transport configuration by selecting the radio button Managed Identity:
Assign the Managed Identity to the Blob Container with the following roles:
Storage Blob Data Contributor
Storage Blob Data Reader
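If the Azure CLI is available, a role assignment of this kind can be created along the following lines; all IDs and scopes are placeholders that must be replaced with the actual values:

```
# Assign a data-plane role to the managed identity (all values are placeholders);
# repeat with "Storage Blob Data Reader" for the second role
az role assignment create \
  --assignee "<managed-identity-client-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<account-name>"
```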
For testing purposes, it is recommended to use the Microsoft Azure Storage Explorer:
A transport configuration can be executed in the System Management application.
Log in to the System Management application.
Select the Domain (site) where you intend to create the schedule and click Apply.
Enter all required parameters and click Apply.
| Parameter | Value |
|---|---|
| Pipeline | FileTransportJob |
| Startnode | Start |
Switch to the Attributes tab.
The job needs two parameters:
TransportProcessID: the process ID of the transport configuration
UnitName: the name of the domain
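Using the SFTP example configuration from above, the two attributes would be set as follows (values taken from the SFTP.properties example, where process.id = testSFTP and domain = inSPIRED-inTRONICS):

```
TransportProcessID = testSFTP
UnitName = inSPIRED-inTRONICS
```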
To add an additional transport type, create a class implementing com.intershop.component.transport.capi.provider.TransportProvider. This class is responsible for creating and looking up existing TransportConfiguration objects and for creating the corresponding TransportExecutor, which implements the actual file transport.
com.intershop.component.transport.capi.provider.TransportProvider
public interface TransportProvider
{
    /**
     * The type of the transport.
     */
    String getType();

    /**
     * The name under which the created objects are put into the pipeline dictionary.
     */
    String getDictionaryKey();

    /**
     * Gets a business object for the given persistent object.
     * @param anID the process ID
     * @param someContext a business object context
     * @param transportConfiguration the given persistent object
     * @return the transport process configuration business object
     */
    TransportProcessConfigBO getTransportProcessConfigBO(String anID, BusinessObjectContext someContext,
                    TransportConfiguration transportConfiguration);

    /**
     * Creates a new business object for a transport configuration.
     * @param displayName the display name
     * @param domain the domain the configuration belongs to
     * @param someContext a business object context
     * @return the new transport process configuration business object
     */
    TransportProcessConfigBO createTransportProcessConfigBOByName(String displayName, Domain domain,
                    BusinessObjectContext someContext);

    /**
     * Creates a new business object for a transport configuration.
     * @param processID a process ID
     * @param domain the domain the configuration belongs to
     * @param someContext a business object context
     * @return the new transport process configuration business object
     */
    TransportProcessConfigBO createTransportProcessConfigBO(String processID, Domain domain,
                    BusinessObjectContext someContext);

    /**
     * Creates a new transport executor to execute a file transport.
     * @param anID a process ID
     * @param someContext a business object context
     * @param aConfig the transport process configuration
     * @return the new transport executor business object
     */
    TransportExecutorBO createTransportExecutorBO(String anID, BusinessObjectContext someContext,
                    TransportProcessConfigBO aConfig);

    /**
     * Updates a transport configuration for the specific config type during DBinit / DBmigrate.
     * @param transportConfig the transport config to update
     * @param config map of properties
     */
    default void updateTransportConfiguration(TransportConfiguration transportConfig, Map<String, String> config)
    {
    }
}
Afterwards this class can be bound to the object graph:
AzureTransportModule
public class AzureTransportModule extends AbstractNamingModule
{
    @Override
    protected void configure()
    {
        MapBinder.newMapBinder(binder(), String.class, TransportProvider.class)
                 .addBinding(AzureTransportProvider.TYPE)
                 .to(AzureTransportProvider.class)
                 .in(Singleton.class);
    }
}