The need is straightforward: you want to know when new files appear on a remote FTP server, and you want to do something with them — download them, process them, trigger another workflow. The challenge is that FTP servers don't push notifications. They just sit there. You have to ask.
There are two ways to solve this. The first requires Terminal, shell scripting knowledge, and accepting that things will break silently. The second doesn't require any of that.
The hard way: shell scripts and cron
If you ask a sysadmin how to monitor an FTP folder, the first answer is usually some variation of lftp — a command-line FTP client — combined with a cron job or a launchd agent that runs it on a schedule.
The typical setup looks like this. First, write a shell script that uses lftp to mirror the remote folder to a local one:
#!/bin/bash
# download-from-ftp.sh
# --only-newer skips files already downloaded; --no-recursion stays in the top-level folder
lftp -u username,password ftp.example.com << EOF
mirror --only-newer --no-recursion /remote/folder /Users/you/Downloads/ftp-watch/
quit
EOF
Then schedule it with a cron entry (crontab -e) to run every five minutes:
# Run FTP sync every 5 minutes
*/5 * * * * /Users/you/scripts/download-from-ftp.sh >> /tmp/ftp-sync.log 2>&1
This works. If everything is set up correctly and stays set up correctly, it checks the server every five minutes and downloads anything new.
But there are real problems with this approach:
- No notifications. Files land in the folder — you don't know until you check manually.
- Breaks silently. If the password changes, the server moves, or the network has a hiccup, the script fails quietly. You find out when someone asks why files didn't arrive.
- No UI. Debugging requires reading log files. Checking status requires running commands.
- Requires maintenance. macOS updates sometimes break cron or launchd configurations. Credentials must be updated in plaintext shell scripts or credential files.
- No error handling. A partial download or a server timeout just produces an error in the log — if you're reading it.
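To see what fixing even one of these problems costs, here is a sketch of a wrapper that at least surfaces failures as a macOS notification instead of a silent log line. The `sync_and_notify` name is invented for illustration; it reuses the hypothetical credentials and paths from the script above, and `osascript` is macOS-only:

```shell
#!/bin/bash
# Hypothetical wrapper: run the lftp mirror and, on failure, post a
# Notification Center alert via osascript instead of failing silently.
sync_and_notify() {
  if ! lftp -u username,password -e \
      "mirror --only-newer --no-recursion /remote/folder /Users/you/Downloads/ftp-watch/; quit" \
      ftp.example.com; then
    # AppleScript one-liner that posts to macOS Notification Center
    osascript -e 'display notification "FTP sync failed" with title "ftp-watch"'
  fi
}
```

And that still leaves retries, partial downloads, and credential storage unsolved — each one more script to write and maintain.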
For a developer who's comfortable in the terminal and prefers code to GUI tools, this can be acceptable. For anyone else — and for workflows where reliability and feedback matter — it's the wrong tool.
The easy way: FTPull
FTPull is a macOS app that monitors remote FTP folders and downloads new files automatically. It's a GUI wrapper around the same fundamental concept — periodic polling of the server — but with all the problems of the shell script approach solved.
To start monitoring an FTP folder:
- Install FTPull and open it from your Applications folder. It adds an icon to your menu bar.
- Open Settings and add a new connection: hostname, port, username, password, protocol (FTP/SFTP/FTPS).
- Set the remote folder path — the directory on the server you want to watch.
- Set a local folder — where downloaded files should go on your Mac.
- Set the polling interval — how often FTPull checks the server. Every minute gives responsive monitoring; every 5 minutes keeps network usage lighter.
- Enable the connection. FTPull starts polling immediately.
That's the setup. From this point on, FTPull handles everything automatically. When a new file appears on the server, FTPull detects it at the next poll, downloads it, and sends a macOS notification.
What FTPull monitors
FTPull tracks the state of the remote folder by comparing the current file listing with what it saw on the previous poll. Anything new gets downloaded. Specifically:
- New files — files that weren't present in the previous listing
- Modified files (optional) — files whose size or modification timestamp changed
- Subdirectories — FTPull can monitor subdirectories recursively, so you don't need a separate connection for each subfolder
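The diff-against-previous-listing idea itself is simple enough to sketch in shell. Assuming each listing is a plain file with one name per line (the function name and file layout are illustrative, not FTPull's implementation):

```shell
# Sketch of the poll-and-diff idea: names present in the current
# listing but absent from the previous one are the new files.
new_files() {
  # $1 = previous listing file, $2 = current listing file
  grep -Fxv -f "$1" "$2"
}
```

The hard part is everything around that diff — saving state between polls, handling timeouts, downloading reliably — which is what the app takes care of.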
Extension filters
If you only care about certain file types — say, .csv files from a data feed or .pdf files from a processing system — set an extension filter. FTPull will silently skip everything else. This is useful when the remote folder also receives file types you don't need to act on.
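For comparison, the script approach has to spell the filter out in the lftp command; `--include-glob` is lftp's mechanism for restricting a mirror to matching names. A sketch, reusing the hypothetical connection details from the earlier script:

```shell
# Mirror only *.csv files from the remote folder; skip everything else.
# Credentials and paths are the same placeholders as the earlier script.
mirror_csv_only() {
  lftp -u username,password -e \
    "mirror --only-newer --no-recursion --include-glob *.csv /remote/folder /Users/you/Downloads/ftp-watch/; quit" \
    ftp.example.com
}
```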
Multiple folders, multiple servers
One of the practical advantages over shell scripts is multi-connection management. Need to monitor five different FTP servers — a client server, an internal server, a supplier's feed, and two regional variants? Add five connections in FTPull. They all run independently from a single menu bar icon, each with its own polling interval and settings.
With cron, each of those would be a separate script, a separate cron entry, and a separate log file to not-read.
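The best you can do on the script side is collapse the sprawl into one loop over a connections file — a sketch, assuming a hypothetical CSV layout of `host,user:pass,remote,local` — and even then every connection shares one interval and there is still no feedback:

```shell
# Loop over a list of connections and mirror each in turn.
# The CSV layout and file path are invented for illustration.
sync_all() {
  while IFS=, read -r host userpass remote dest; do
    lftp -u "$userpass" -e "mirror --only-newer $remote $dest; quit" "$host"
  done < "$1"
}
```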
Scheduling
FTPull includes a scheduling feature that restricts monitoring to certain hours and days. If files only ever arrive during business hours — from a system that runs 9-to-5 — there's no point polling at 3am on a Saturday. Set the schedule (e.g., 8am–8pm, Monday–Friday), and FTPull pauses outside those hours automatically.
This also reduces unnecessary network traffic and server load — particularly relevant if you're monitoring a server with strict rate limits or connection quotas.
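For reference, cron can express a similar window in its schedule fields — an hour field of 8-19 with a weekday field of 1-5 runs the job from 8:00am through 7:55pm, Monday to Friday:

```
# Every 5 minutes, business hours only (last run 7:55pm), Mon-Fri
*/5 8-19 * * 1-5 /Users/you/scripts/download-from-ftp.sh >> /tmp/ftp-sync.log 2>&1
```

But that's the only part of FTPull's scheduling cron replicates; pausing, resuming, and per-connection schedules would again be your script's problem.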
When something goes wrong
This is where FTPull most clearly differs from the script approach. When a download fails:
- The file gets a red Finder tag — visible immediately in the download folder without opening the app
- An error notification is sent via macOS Notification Center
- The error is logged in FTPull's log view with the specific failure reason
- The file is retried on the next polling cycle
Compare this to a cron script: the error gets appended to a log file you probably aren't reading, nothing else happens, and the file doesn't get downloaded until someone notices.
Scripts require ongoing maintenance
Shell scripts and cron jobs have no UI, no notifications, and no error recovery. A credential change, a server move, or a network issue will cause them to fail silently — and you won't know until someone asks why the files never arrived.
FTPull handles all of that for you. Configure it once, and it runs quietly in the background indefinitely — with full feedback when something needs your attention.