SMART External File Storage - FTP Connector
The SMART External File Storage - FTP Connector extension for Microsoft Dynamics 365 Business Central provides connectivity to remote file servers over FTP, SFTP, and other file transfer protocols through the unified API of the External File Storage module. The extension enables a full range of file and directory operations on remote servers directly from Business Central.
Key Features
- Support for multiple protocols: SFTP, SCP, FTP, WebDAV, S3
- Secure credential storage in isolated storage
- Flexible security settings for SFTP/SSH connections
- Support for SSH private key authentication
- Option to use your own Azure Function for additional security
- Complete set of file operations: create, read, upload, delete
- Directory management: create, browse contents
- Integration with External File Storage module for unified access
FTP Account Setup
To configure a connection to an FTP/SFTP server, you need to create an account.
Creating an Account
- Choose the Search icon, enter External File Accounts, and follow the corresponding link.
- Click the Add a file account action, and select SMART FTP connector.
- On the wizard page, fill in the required fields:
Basic Connection Settings
| Field | Description |
|---|---|
| Account Name | A unique name for the account to identify the connection in the system. Used for display in lists and selecting the account when working with files. |
| Azure Function Url | The URL of the Azure Function used as an intermediate gateway for performing operations with the FTP server. By default, a publicly available SMART business function is used. You can deploy your own function for additional security control, VNet integration, or private endpoints. |
| Azure Function Auth Code | Function-level authorization code for authenticating requests to the Azure Function. If left empty, anonymous authentication is used. Fill in this field if using your own secured Azure Function. |
Protocol and Address Settings
| Field | Description |
|---|---|
| Protocol | File transfer protocol used for the connection:<br>- SFTP - SSH File Transfer Protocol, secure file transfer over SSH<br>- SCP - Secure Copy Protocol, simple file copying over SSH<br>- FTP - File Transfer Protocol, the standard transfer protocol (can be combined with additional encryption)<br>- WebDAV - Web Distributed Authoring and Versioning, file access over HTTP/HTTPS<br>- S3 - Amazon S3 protocol for working with object storage |
| FTP Secure | Type of secure connection for the FTP protocol (available only when FTP is selected):<br>- None - unencrypted connection<br>- Implicit - implicit encryption (FTPS), the connection is encrypted from the start<br>- Explicit - explicit encryption (FTPES), encryption is negotiated via a command after the initial unsecured connection |
| FTP Address | Full address of the server to connect to. Can be specified as an IP address (e.g., 192.168.1.100) or domain name (e.g., ftp.example.com). |
| Port Number | Port for connecting to the server. Standard values:<br>- 22 for SFTP/SCP<br>- 21 for FTP<br>- 990 for FTPS (Implicit)<br>If your server uses a non-standard port, specify the appropriate value. |
| Base Folder | Initial directory on the server from which operations will be performed. All file paths will be relative to this directory. By default, the root directory ("/") is used. For example, if you specify "/documents", all operations will be performed relative to the documents folder. |
Authentication
| Field | Description |
|---|---|
| User Name | Username for authentication on the FTP/SFTP server. This field is stored in secure storage. |
| Password | User password for authentication. Data is stored in Business Central isolated storage and encrypted. When viewing an existing account, the password is displayed as "***". |
SSH Security Settings (for SFTP/SCP)
When using SFTP or SCP protocols, additional security parameters for SSH are available.
| Field | Description |
|---|---|
| Ssh Host Key Policy | Defines how the system verifies the server's SSH host key when connecting:<br>- Check - strictly verify the host key. The connection is allowed only if the host key exactly matches the specified fingerprint. Recommended for maximum security.<br>- Accept New - automatically accept a new host key on the first connection and verify it on subsequent connections. Useful when the key fingerprint is not known in advance.<br>- Give Up Security And Accept Any - accept any host key without verification. Not recommended in production environments due to the risk of man-in-the-middle attacks. |
| Ssh Host Key Fingerprint | Fingerprint of the server's SSH host key, used to verify the server's authenticity and protect against impersonation attacks. Fingerprint format: algorithm:value, for example:<br>- ssh-rsa 2048 xx:xx:xx:...:xx<br>- ssh-ed25519 256 xx:xx:xx:...:xx<br>You can obtain the fingerprint from the server administrator or with the ssh-keyscan command. The field is active only when the host key policy is set to "Check". |
| Ssh Private Key | Private SSH key for key-based authentication (instead of password). OpenSSH and PuTTY PPK formats are supported. Using SSH keys provides a higher level of security compared to passwords. To upload a key, click on the field and select the private key file. The key file is stored in the database in encrypted form. If the private key is protected with a passphrase, specify it in the "Password" field. |
Other Parameters
| Field | Description |
|---|---|
| Disabled | Indicator showing whether the account is disabled. If set, the account cannot be used for file operations. This field is automatically set when creating a sandbox environment to prevent unwanted operations with production servers. |
Deploying Your Own Azure Function
By default, the extension uses a publicly available Azure Function from SMART business to perform operations with FTP servers. However, to enhance security, integrate with a corporate network, or comply with internal security policies, you can deploy your own Azure Function.
When to Use Your Own Azure Function
- VNet (Azure Virtual Network) integration is required
- Private Endpoints connection is needed
- Requirements for isolation and access control
- Need for advanced operation auditing
- Corporate security policies require using only your own resources
Deployment Process
- On the FTP Account card, click the Deploy Azure Function button in the Actions menu.
- The Azure portal will open with a pre-populated ARM template for deployment.
- Complete the following steps in the Azure portal:
- Select Azure subscription
- Select existing or create new resource group
- Select deployment region (same region as Business Central is recommended)
- Review and confirm deployment parameters
- Start deployment and wait for completion (usually takes 5-10 minutes).
- After deployment completion:
- Go to the created Azure Function in the Azure portal
- Copy the Function URL
- Copy the Function Key (authorization code) if configured
- Return to the FTP Account card in Business Central:
- Paste the function URL into the Azure Function Url field
- Paste the authorization code into the Azure Function Auth Code field
Usage
Basic File Operations
After configuring the FTP account, it automatically integrates with the External File Storage module and becomes available for file operations:
- View directory contents - browse files and subdirectories on the FTP server
- Download files - download files from FTP server to Business Central
- Upload files - upload files from Business Central to FTP server
- Delete files - delete files on the FTP server
- Create directories - create new directories on the server
- Delete directories - delete empty directories
Integration with External File Storage Module
After setup, the FTP connector automatically registers with the External File Storage module, which enables:
- Use of a unified API for working with files regardless of storage type
- Centralized access management for file operations
- Single interface for working with different storage sources (FTP, Azure Blob Storage, SharePoint, etc.)
- Standardized file operations across different Business Central modules
- Ability to easily change file storage location without changing business logic
Using from Custom AL Code
This section explains how to work with the FTP connector programmatically from your own AL extensions.
File Scenario Setup
Before using the API, you must link your FTP account to a file scenario:
- Open File Scenario Setup (search for it in Business Central).
- Click New and select your scenario from the Scenario column.
- In the Connector column select SMART FTP.
- In the Account Name column select the FTP account you configured earlier.
The scenario is a logical name your AL code uses to reference the file account. This decouples business logic from the physical account — you can point the scenario to a different account without changing any code.
To register a custom scenario, add an enum extension to "File Scenario" in your AL project:
```al
enumextension 50100 "My File Scenarios" extends "File Scenario"
{
    value(50100; "EDI Provider")
    {
        Caption = 'EDI Provider';
    }
}
```
Path Resolution
All paths passed to the API are relative to the Base Folder configured on the FTP account. The connector prepends the base folder automatically before sending any request to the server.
| Setting | Value |
|---|---|
| Base Folder (account card) | /edi/ |
| Path in AL code | /inbox/ |
| Actual path on the server | /edi/inbox/ |
This means your AL code never contains hard-coded server roots. Changing the root folder requires only one configuration change on the account card, not a code change.
Use ExternalFileStorage.CombinePath() when building paths dynamically — it handles trailing and leading slashes correctly across all connectors.
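A minimal sketch of building a path this way (the "EDI Provider" scenario, folder, and helper name are illustrative, assuming the standard External File Storage API):

```al
// Sketch: build a server path from a folder and a file name.
// The returned path stays relative to the account's Base Folder;
// the connector prepends the base folder before contacting the server.
local procedure GetInboxFilePath(FileName: Text): Text
var
    ExternalFileStorage: Codeunit "External File Storage";
begin
    ExternalFileStorage.Initialize(Enum::"File Scenario"::"EDI Provider");
    // CombinePath normalizes slashes, e.g. '/inbox/' + 'order.xml' -> '/inbox/order.xml'.
    exit(ExternalFileStorage.CombinePath('/inbox/', FileName));
end;
```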
Reading Files from Inbox and Moving to Processed
A typical integration pattern: list files in an inbox folder, download each one, process it, then move it to a processed folder so it is not picked up again.
```al
procedure ProcessInboxFiles()
var
    TempFileAccountContent: Record "File Account Content" temporary;
    ExternalFileStorage: Codeunit "External File Storage";
    FilePaginationData: Codeunit "File Pagination Data";
    FileInStr: InStream;
    SourcePath: Text;
    DestinationPath: Text;
begin
    // Initialize with the scenario. BC resolves the linked account from File Scenario Setup.
    ExternalFileStorage.Initialize(Enum::"File Scenario"::"EDI Provider");

    // Verify the required folders exist before iterating.
    if not ExternalFileStorage.DirectoryExists('/inbox/') then
        Error('Inbox folder does not exist.');
    if not ExternalFileStorage.DirectoryExists('/processed/') then
        Error('Processed folder does not exist.');

    ExternalFileStorage.ListFiles('/inbox/', FilePaginationData, TempFileAccountContent);
    if not TempFileAccountContent.FindSet() then
        exit;
    repeat
        SourcePath := ExternalFileStorage.CombinePath(
            TempFileAccountContent."Parent Directory",
            TempFileAccountContent.Name);

        // GetFile is a [TryFunction]: returns false on failure instead of throwing.
        // This ensures one bad file does not abort the entire batch.
        if ExternalFileStorage.GetFile(SourcePath, FileInStr) then begin
            // Process the file content here (parse XML, JSON, etc.)
            DestinationPath := ExternalFileStorage.CombinePath('/processed/', TempFileAccountContent.Name);
            ExternalFileStorage.MoveFile(SourcePath, DestinationPath);
        end;
    until TempFileAccountContent.Next() = 0;
end;
```
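For large folders, ListFiles returns results one page at a time via the File Pagination Data codeunit. A sketch of paging through an entire listing, assuming the standard System Application API (CountInboxFiles is a hypothetical helper; the scenario and folder name are the examples used above):

```al
procedure CountInboxFiles(): Integer
var
    TempFileAccountContent: Record "File Account Content" temporary;
    ExternalFileStorage: Codeunit "External File Storage";
    FilePaginationData: Codeunit "File Pagination Data";
    Total: Integer;
begin
    ExternalFileStorage.Initialize(Enum::"File Scenario"::"EDI Provider");
    repeat
        // Clear the buffer, then fetch the next page; the pagination
        // codeunit tracks the marker between calls.
        TempFileAccountContent.DeleteAll();
        ExternalFileStorage.ListFiles('/inbox/', FilePaginationData, TempFileAccountContent);
        Total += TempFileAccountContent.Count();
    until FilePaginationData.IsEndOfListing();
    exit(Total);
end;
```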
Creating and Uploading a File to Outbox
To produce a file and send it to the FTP server, write content into a Temp Blob and pass it to CreateFile. Temp Blob is the standard BC mechanism for building binary or text content in memory before passing it to an InStream-based API.
```al
procedure UploadToOutbox(FileName: Text; var ContentTempBlob: Codeunit "Temp Blob")
var
    ExternalFileStorage: Codeunit "External File Storage";
    ContentInStr: InStream;
    FilePath: Text;
begin
    ExternalFileStorage.Initialize(Enum::"File Scenario"::"EDI Provider");
    FilePath := ExternalFileStorage.CombinePath('/outbox/', FileName);

    // Convert the blob to an InStream for upload.
    ContentTempBlob.CreateInStream(ContentInStr, TextEncoding::UTF8);
    if not ExternalFileStorage.CreateFile(FilePath, ContentInStr) then
        Error('Failed to upload "%1" to outbox.', FileName);
end;
```
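A possible caller, writing text content into a Temp Blob before handing it to UploadToOutbox (the procedure name, file name, and XML payload are illustrative):

```al
procedure SendOrderConfirmation()
var
    TempBlob: Codeunit "Temp Blob";
    OutStr: OutStream;
begin
    // Build the file content in memory.
    TempBlob.CreateOutStream(OutStr, TextEncoding::UTF8);
    OutStr.WriteText('<OrderConfirmation>...</OrderConfirmation>');
    // Hand the blob to the upload procedure defined above.
    UploadToOutbox('confirmation.xml', TempBlob);
end;
```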
Error Handling
The following methods are declared as [TryFunction], meaning they return false on failure instead of throwing a runtime error:
| Method | Behaviour on failure |
|---|---|
| GetFile | Returns false. File remains in place. |
| CreateFile | Returns false. No file is created on the server. |
| MoveFile | Returns false. Source file is not moved. |
| CopyFile | Returns false. No copy is created. |
| DeleteFile | Returns false. File is not deleted. |
| ListDirectories | Returns false. Result record is empty. |
| CreateDirectory | Returns false. Directory is not created. |
| DeleteDirectory | Returns false. Directory is not deleted. |
DirectoryExists and FileExists are not TryFunctions: they throw on unexpected errors (e.g. authentication failure) but return false when the path simply does not exist.
Always check the return value of TryFunctions explicitly:
```al
if not ExternalFileStorage.CreateFile(FilePath, ContentInStr) then
    Error('Upload failed for "%1". Check the FTP account configuration.', FileName);
```
For batch operations (e.g. processing multiple files), use the false return to skip a single file without aborting the loop — see the inbox example above.
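When a TryFunction returns false, the underlying error text is still available via GetLastErrorText. A sketch of logging the root cause instead of failing the session (the telemetry event ID and custom dimension values are placeholders):

```al
if not ExternalFileStorage.DeleteFile(FilePath) then
    // Record the underlying error without aborting the batch.
    Session.LogMessage('FTP-0001', GetLastErrorText(), Verbosity::Warning,
        DataClassification::SystemMetadata, TelemetryScope::ExtensionPublisher,
        'Category', 'SMART FTP Connector');
```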