The Problem
Your sales team lives in Salesforce, but your application relies on PostgreSQL for contact data. Manually exporting and re-importing CSV files creates data staleness: contacts updated in Salesforce this morning won’t appear in your database until someone remembers to run the export. Worse, you lose audit trails, can’t track which records changed, and risk overwriting newer data with stale exports. A near-real-time sync solves this: contacts created or updated in Salesforce flow into your PostgreSQL database automatically within minutes. However, Salesforce’s API has query limits, pagination complexities, and field-mapping challenges. You need to handle authentication, parse Salesforce’s JSON responses, map fields correctly, detect changes efficiently, and fail gracefully when API limits are hit. This tutorial provides production-ready code that syncs Salesforce contacts to PostgreSQL using scheduled incremental polling plus on-demand sync endpoints, with proper error handling and sync logging.
Tech Stack & Prerequisites
- Node.js v18+ with npm
- PostgreSQL 14+ installed and running locally or remotely
- Salesforce Developer Account (free at developer.salesforce.com)
- Salesforce Connected App with OAuth 2.0 configured
- jsforce 2.0+ (Salesforce API client for Node.js)
- pg 8.11+ (PostgreSQL client for Node.js)
- dotenv for environment variables
- node-cron 3.0+ for scheduled sync jobs
- express 4.18+ for the HTTP API (health check and manual sync endpoints)
Required Salesforce Setup:
- Connected App created in Salesforce Setup
- OAuth scopes: api, refresh_token, offline_access
- Callback URL configured (e.g., http://localhost:3000/oauth/callback)
- Security Token for your Salesforce user account
- Admin permissions to access the Contact object
Required PostgreSQL Setup:
- Database created for storing contacts
- User with CREATE, INSERT, UPDATE, DELETE privileges
Step-by-Step Implementation
Step 1: Setup
Initialize the project:
```
mkdir salesforce-postgres-sync
cd salesforce-postgres-sync
npm init -y
npm install jsforce pg dotenv express node-cron
npm install --save-dev nodemon
```
Create project structure:
```
mkdir src config
touch src/server.js src/salesforce.js src/database.js src/sync.js
touch config/database.sql .env .gitignore
```
Your structure should be:
```
salesforce-postgres-sync/
├── src/
│ ├── server.js
│ ├── salesforce.js
│ ├── database.js
│ └── sync.js
├── config/
│ └── database.sql
├── .env
├── .gitignore
└── package.json
```
config/database.sql — Create the PostgreSQL schema:
-- Create contacts table matching Salesforce structure
CREATE TABLE IF NOT EXISTS salesforce_contacts (
id SERIAL PRIMARY KEY,
salesforce_id VARCHAR(18) UNIQUE NOT NULL,
first_name VARCHAR(255),
last_name VARCHAR(255),
email VARCHAR(255),
phone VARCHAR(50),
account_id VARCHAR(18),
title VARCHAR(255),
department VARCHAR(255),
created_date TIMESTAMP,
last_modified_date TIMESTAMP,
  synced_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
-- Create indexes for faster lookups. The UNIQUE constraint on
-- salesforce_id already creates an index for that column.
CREATE INDEX IF NOT EXISTS idx_email ON salesforce_contacts(email);
CREATE INDEX IF NOT EXISTS idx_last_modified ON salesforce_contacts(last_modified_date);
-- Create sync log table for tracking
CREATE TABLE IF NOT EXISTS sync_logs (
id SERIAL PRIMARY KEY,
sync_type VARCHAR(50), -- 'full' or 'incremental'
records_processed INTEGER,
records_inserted INTEGER,
records_updated INTEGER,
errors INTEGER,
started_at TIMESTAMP,
completed_at TIMESTAMP,
status VARCHAR(50), -- 'success', 'failed', 'partial'
error_message TEXT
);
-- Create table for tracking last sync timestamp
CREATE TABLE IF NOT EXISTS sync_state (
id INTEGER PRIMARY KEY DEFAULT 1,
last_sync_timestamp TIMESTAMP,
CONSTRAINT single_row CHECK (id = 1)
);
-- Insert initial sync state
INSERT INTO sync_state (id, last_sync_timestamp)
VALUES (1, '2000-01-01 00:00:00')
ON CONFLICT (id) DO NOTHING;
Run the SQL to create tables:
psql -U your_username -d your_database -f config/database.sql
Step 2: Configuration
.env — Store credentials securely:
# Salesforce Configuration
SF_LOGIN_URL=https://login.salesforce.com
SF_USERNAME=your_salesforce_email@example.com
SF_PASSWORD=your_salesforce_password
SF_SECURITY_TOKEN=your_security_token
SF_CLIENT_ID=your_connected_app_consumer_key
SF_CLIENT_SECRET=your_connected_app_consumer_secret
# PostgreSQL Configuration
PG_HOST=localhost
PG_PORT=5432
PG_DATABASE=your_database_name
PG_USER=your_pg_username
PG_PASSWORD=your_pg_password
# Sync Configuration
SYNC_INTERVAL_MINUTES=15
ENABLE_SCHEDULED_SYNC=true
PORT=3000
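Since every later step depends on these variables, it's worth failing fast when one is unset. A minimal sketch (checkRequiredEnv is a hypothetical helper, not part of the tutorial's files):

```javascript
// Hypothetical startup guard: report required variables that are missing,
// so misconfiguration fails loudly at boot instead of mid-sync.
function checkRequiredEnv(env, keys) {
  return keys.filter((key) => !env[key]); // empty array = config complete
}

const required = [
  'SF_LOGIN_URL', 'SF_USERNAME', 'SF_PASSWORD', 'SF_SECURITY_TOKEN',
  'PG_HOST', 'PG_PORT', 'PG_DATABASE', 'PG_USER', 'PG_PASSWORD'
];
const missing = checkRequiredEnv(process.env, required);
if (missing.length > 0) {
  console.warn(`Missing environment variables: ${missing.join(', ')}`);
}
```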
Update .gitignore:
echo "node_modules/
.env
*.log" > .gitignore
package.json — Add scripts:
{
"name": "salesforce-postgres-sync",
"version": "1.0.0",
"type": "module",
"scripts": {
"start": "node src/server.js",
"dev": "nodemon src/server.js",
"sync:once": "node -e \"import('./src/sync.js').then(m => m.runFullSync())\""
}
}
Step 3: Core Logic
src/database.js — PostgreSQL connection and operations:
import pg from 'pg';
import dotenv from 'dotenv';
dotenv.config();
const { Pool } = pg;
// Create connection pool
const pool = new Pool({
host: process.env.PG_HOST,
port: process.env.PG_PORT,
database: process.env.PG_DATABASE,
user: process.env.PG_USER,
password: process.env.PG_PASSWORD,
max: 20, // Maximum number of clients in pool
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 2000,
});
// Test database connection
export async function testConnection() {
try {
const client = await pool.connect();
console.log('✓ PostgreSQL connected successfully');
client.release();
return true;
} catch (error) {
console.error('✗ PostgreSQL connection failed:', error.message);
return false;
}
}
// Upsert contact (insert or update if exists)
export async function upsertContact(contact) {
const query = `
INSERT INTO salesforce_contacts (
salesforce_id, first_name, last_name, email, phone,
account_id, title, department, created_date, last_modified_date
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10)
ON CONFLICT (salesforce_id)
DO UPDATE SET
first_name = EXCLUDED.first_name,
last_name = EXCLUDED.last_name,
email = EXCLUDED.email,
phone = EXCLUDED.phone,
account_id = EXCLUDED.account_id,
title = EXCLUDED.title,
department = EXCLUDED.department,
last_modified_date = EXCLUDED.last_modified_date,
synced_at = CURRENT_TIMESTAMP
RETURNING id, salesforce_id, (xmax = 0) AS inserted
`;
const values = [
contact.Id,
contact.FirstName,
contact.LastName,
contact.Email,
contact.Phone,
contact.AccountId,
contact.Title,
contact.Department,
contact.CreatedDate,
contact.LastModifiedDate
];
try {
const result = await pool.query(query, values);
return {
success: true,
inserted: result.rows[0].inserted,
id: result.rows[0].id
};
} catch (error) {
console.error(`Error upserting contact ${contact.Id}:`, error.message);
return { success: false, error: error.message };
}
}
// Batch upsert contacts. Each row is upserted independently on a pooled
// connection, so one bad record is logged and counted without aborting
// the whole batch.
export async function upsertContactsBatch(contacts) {
  let inserted = 0;
  let updated = 0;
  let errors = 0;
  for (const contact of contacts) {
    const result = await upsertContact(contact);
    if (result.success) {
      if (result.inserted) {
        inserted++;
      } else {
        updated++;
      }
    } else {
      errors++;
    }
  }
  return { inserted, updated, errors };
}
// Get last sync timestamp
export async function getLastSyncTimestamp() {
const query = 'SELECT last_sync_timestamp FROM sync_state WHERE id = 1';
const result = await pool.query(query);
return result.rows[0]?.last_sync_timestamp || new Date('2000-01-01');
}
// Update last sync timestamp
export async function updateLastSyncTimestamp(timestamp) {
const query = 'UPDATE sync_state SET last_sync_timestamp = $1 WHERE id = 1';
await pool.query(query, [timestamp]);
}
// Log sync operation
export async function logSync(syncData) {
const query = `
INSERT INTO sync_logs (
sync_type, records_processed, records_inserted, records_updated,
errors, started_at, completed_at, status, error_message
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)
RETURNING id
`;
const values = [
syncData.syncType,
syncData.recordsProcessed,
syncData.recordsInserted,
syncData.recordsUpdated,
syncData.errors,
syncData.startedAt,
syncData.completedAt,
syncData.status,
syncData.errorMessage
];
const result = await pool.query(query, values);
return result.rows[0].id;
}
// Get contact count
export async function getContactCount() {
const result = await pool.query('SELECT COUNT(*) FROM salesforce_contacts');
return parseInt(result.rows[0].count);
}
export default pool;
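The field mapping inside upsertContact is easy to get subtly wrong: Salesforce uses PascalCase field names and omits unset fields entirely, while the INSERT expects ten positional values. A small pure function makes that contract explicit and testable; this is a sketch with a hypothetical name, not part of the files above:

```javascript
// Hypothetical helper: map a Salesforce Contact record to the positional
// values expected by the INSERT statement ($1..$10). Optional fields that
// Salesforce omits are normalized to explicit nulls rather than undefined.
function contactToRow(contact) {
  return [
    contact.Id,
    contact.FirstName ?? null,
    contact.LastName ?? null,
    contact.Email ?? null,
    contact.Phone ?? null,
    contact.AccountId ?? null,
    contact.Title ?? null,
    contact.Department ?? null,
    contact.CreatedDate ?? null,
    contact.LastModifiedDate ?? null,
  ];
}
```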
src/salesforce.js — Salesforce API client:
import jsforce from 'jsforce';
import dotenv from 'dotenv';
dotenv.config();
let conn = null;
// Initialize Salesforce connection (cached). Note: an expired session will
// surface as an INVALID_SESSION_ID error on a later query, at which point
// the connection should be re-established.
export async function connectToSalesforce() {
  if (conn && conn.accessToken) {
    return conn;
  }
conn = new jsforce.Connection({
loginUrl: process.env.SF_LOGIN_URL
});
try {
await conn.login(
process.env.SF_USERNAME,
process.env.SF_PASSWORD + process.env.SF_SECURITY_TOKEN
);
console.log('✓ Salesforce connected successfully');
console.log(` Organization ID: ${conn.userInfo.organizationId}`);
return conn;
} catch (error) {
console.error('✗ Salesforce connection failed:', error.message);
throw error;
}
}
// Fetch all contacts (full sync)
export async function fetchAllContacts() {
const connection = await connectToSalesforce();
const query = `
SELECT Id, FirstName, LastName, Email, Phone, AccountId,
Title, Department, CreatedDate, LastModifiedDate
FROM Contact
ORDER BY LastModifiedDate DESC
`;
try {
const result = await connection.query(query);
console.log(`Fetched ${result.totalSize} contacts from Salesforce`);
return result.records;
} catch (error) {
console.error('Error fetching contacts:', error.message);
throw error;
}
}
// Fetch contacts modified since timestamp (incremental sync)
export async function fetchModifiedContacts(lastSyncTime) {
const connection = await connectToSalesforce();
// Format timestamp for SOQL query (ISO format)
const formattedTime = new Date(lastSyncTime).toISOString();
const query = `
SELECT Id, FirstName, LastName, Email, Phone, AccountId,
Title, Department, CreatedDate, LastModifiedDate
FROM Contact
WHERE LastModifiedDate > ${formattedTime}
ORDER BY LastModifiedDate DESC
`;
try {
const result = await connection.query(query);
console.log(`Fetched ${result.totalSize} modified contacts since ${formattedTime}`);
return result.records;
} catch (error) {
console.error('Error fetching modified contacts:', error.message);
throw error;
}
}
// Query contacts with custom SOQL
export async function queryContacts(soqlQuery) {
const connection = await connectToSalesforce();
try {
const result = await connection.query(soqlQuery);
return result.records;
} catch (error) {
console.error('Error executing SOQL query:', error.message);
throw error;
}
}
// Fetch all contacts for large datasets, paging through results with
// queryMore (the REST API returns up to 2,000 records per page)
export async function fetchContactsInBatches() {
const connection = await connectToSalesforce();
const allContacts = [];
const query = `
SELECT Id, FirstName, LastName, Email, Phone, AccountId,
Title, Department, CreatedDate, LastModifiedDate
FROM Contact
ORDER BY LastModifiedDate DESC
`;
try {
let result = await connection.query(query);
allContacts.push(...result.records);
// Handle pagination if more records exist
while (!result.done) {
result = await connection.queryMore(result.nextRecordsUrl);
allContacts.push(...result.records);
console.log(`Fetched batch: ${allContacts.length} contacts so far...`);
}
console.log(`Total contacts fetched: ${allContacts.length}`);
return allContacts;
} catch (error) {
console.error('Error fetching contacts in batches:', error.message);
throw error;
}
}
export { conn };
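One detail worth calling out in fetchModifiedContacts: SOQL datetime literals are written unquoted, in ISO-8601 form, unlike string literals. A sketch of the query construction as a standalone, testable function (buildIncrementalSoql is a hypothetical name):

```javascript
// Sketch: build the incremental-sync SOQL. Note the datetime literal after
// the comparison operator is NOT quoted -- quoting it is a common cause of
// MALFORMED_QUERY errors.
function buildIncrementalSoql(lastSyncTime) {
  const iso = new Date(lastSyncTime).toISOString();
  return (
    'SELECT Id, FirstName, LastName, Email, Phone, AccountId, ' +
    'Title, Department, CreatedDate, LastModifiedDate ' +
    'FROM Contact ' +
    `WHERE LastModifiedDate > ${iso} ` +
    'ORDER BY LastModifiedDate DESC'
  );
}

console.log(buildIncrementalSoql('2024-06-01T00:00:00Z'));
```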
src/sync.js — Sync orchestration logic:
import {
upsertContactsBatch,
getLastSyncTimestamp,
updateLastSyncTimestamp,
logSync,
getContactCount
} from './database.js';
import {
fetchAllContacts,
fetchModifiedContacts
} from './salesforce.js';
// Run full sync (all contacts)
export async function runFullSync() {
const startTime = new Date();
console.log('\n🔄 Starting FULL sync...');
const syncLog = {
syncType: 'full',
recordsProcessed: 0,
recordsInserted: 0,
recordsUpdated: 0,
errors: 0,
startedAt: startTime,
completedAt: null,
status: 'running',
errorMessage: null
};
try {
// Fetch all contacts from Salesforce
const contacts = await fetchAllContacts();
syncLog.recordsProcessed = contacts.length;
if (contacts.length === 0) {
console.log('No contacts found in Salesforce');
syncLog.status = 'success';
syncLog.completedAt = new Date();
await logSync(syncLog);
return syncLog;
}
// Upsert to PostgreSQL in batches
const result = await upsertContactsBatch(contacts);
syncLog.recordsInserted = result.inserted;
syncLog.recordsUpdated = result.updated;
syncLog.errors = result.errors;
    // Record the newest LastModifiedDate fetched (the query is ordered DESC)
    // so the next incremental sync can't miss records modified during this run
    await updateLastSyncTimestamp(new Date(contacts[0].LastModifiedDate));
syncLog.status = result.errors > 0 ? 'partial' : 'success';
syncLog.completedAt = new Date();
const duration = (syncLog.completedAt - syncLog.startedAt) / 1000;
console.log(`✓ Full sync completed in ${duration}s`);
console.log(` Inserted: ${result.inserted}`);
console.log(` Updated: ${result.updated}`);
console.log(` Errors: ${result.errors}`);
await logSync(syncLog);
return syncLog;
} catch (error) {
syncLog.status = 'failed';
syncLog.errorMessage = error.message;
syncLog.completedAt = new Date();
console.error('✗ Full sync failed:', error.message);
await logSync(syncLog);
throw error;
}
}
// Run incremental sync (only modified contacts)
export async function runIncrementalSync() {
const startTime = new Date();
console.log('\n🔄 Starting INCREMENTAL sync...');
const syncLog = {
syncType: 'incremental',
recordsProcessed: 0,
recordsInserted: 0,
recordsUpdated: 0,
errors: 0,
startedAt: startTime,
completedAt: null,
status: 'running',
errorMessage: null
};
try {
// Get last sync timestamp
const lastSync = await getLastSyncTimestamp();
console.log(` Last sync: ${lastSync.toISOString()}`);
// Fetch only modified contacts
const contacts = await fetchModifiedContacts(lastSync);
syncLog.recordsProcessed = contacts.length;
if (contacts.length === 0) {
console.log('No modified contacts since last sync');
syncLog.status = 'success';
syncLog.completedAt = new Date();
await logSync(syncLog);
return syncLog;
}
// Upsert to PostgreSQL
const result = await upsertContactsBatch(contacts);
syncLog.recordsInserted = result.inserted;
syncLog.recordsUpdated = result.updated;
syncLog.errors = result.errors;
    // Advance the watermark to the newest LastModifiedDate actually fetched
    // (the query is ordered DESC), so clock skew between this server and
    // Salesforce can't cause modified records to be skipped
    await updateLastSyncTimestamp(new Date(contacts[0].LastModifiedDate));
syncLog.status = result.errors > 0 ? 'partial' : 'success';
syncLog.completedAt = new Date();
const duration = (syncLog.completedAt - syncLog.startedAt) / 1000;
console.log(`✓ Incremental sync completed in ${duration}s`);
console.log(` Inserted: ${result.inserted}`);
console.log(` Updated: ${result.updated}`);
console.log(` Errors: ${result.errors}`);
await logSync(syncLog);
return syncLog;
} catch (error) {
syncLog.status = 'failed';
syncLog.errorMessage = error.message;
syncLog.completedAt = new Date();
console.error('✗ Incremental sync failed:', error.message);
await logSync(syncLog);
throw error;
}
}
// Get sync statistics
export async function getSyncStats() {
const contactCount = await getContactCount();
const lastSync = await getLastSyncTimestamp();
return {
totalContacts: contactCount,
lastSyncTime: lastSync
};
}
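Transient Salesforce errors (timeouts, REQUEST_LIMIT_EXCEEDED) often succeed on retry. If you want the scheduled sync to be more resilient, a small backoff wrapper can be layered around runIncrementalSync(); this is a sketch under that assumption, not part of the files above:

```javascript
// Sketch: retry an async operation with exponential backoff.
// Usage: await withRetry(() => runIncrementalSync(), { retries: 3 });
async function withRetry(fn, { retries = 3, baseDelayMs = 1000 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt >= retries) throw error; // out of attempts, rethrow
      const delay = baseDelayMs * 2 ** attempt; // 1s, 2s, 4s, ...
      console.warn(`Attempt ${attempt + 1} failed (${error.message}); retrying in ${delay}ms`);
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```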
src/server.js — Express server with scheduled sync:
import express from 'express';
import cron from 'node-cron';
import dotenv from 'dotenv';
import { testConnection, getContactCount } from './database.js';
import { connectToSalesforce } from './salesforce.js';
import { runFullSync, runIncrementalSync, getSyncStats } from './sync.js';
dotenv.config();
const app = express();
const PORT = process.env.PORT || 3000;
app.use(express.json());
// Health check endpoint
app.get('/health', async (req, res) => {
try {
const dbConnected = await testConnection();
const sfConnected = await connectToSalesforce();
res.json({
status: 'healthy',
database: dbConnected ? 'connected' : 'disconnected',
salesforce: sfConnected ? 'connected' : 'disconnected',
timestamp: new Date().toISOString()
});
} catch (error) {
res.status(500).json({
status: 'unhealthy',
error: error.message
});
}
});
// Trigger full sync manually
app.post('/sync/full', async (req, res) => {
try {
console.log('Manual full sync triggered via API');
const result = await runFullSync();
res.json({
success: true,
result
});
} catch (error) {
res.status(500).json({
success: false,
error: error.message
});
}
});
// Trigger incremental sync manually
app.post('/sync/incremental', async (req, res) => {
try {
console.log('Manual incremental sync triggered via API');
const result = await runIncrementalSync();
res.json({
success: true,
result
});
} catch (error) {
res.status(500).json({
success: false,
error: error.message
});
}
});
// Get sync statistics
app.get('/sync/stats', async (req, res) => {
try {
const stats = await getSyncStats();
res.json(stats);
} catch (error) {
res.status(500).json({
error: error.message
});
}
});
// Start server
async function startServer() {
try {
// Test connections
await testConnection();
await connectToSalesforce();
app.listen(PORT, () => {
console.log(`\n🚀 Server running on http://localhost:${PORT}`);
console.log(`\nAvailable endpoints:`);
console.log(` GET /health - Health check`);
console.log(` POST /sync/full - Trigger full sync`);
console.log(` POST /sync/incremental - Trigger incremental sync`);
console.log(` GET /sync/stats - Get sync statistics\n`);
});
// Schedule automatic incremental sync
if (process.env.ENABLE_SCHEDULED_SYNC === 'true') {
const interval = process.env.SYNC_INTERVAL_MINUTES || 15;
const cronExpression = `*/${interval} * * * *`;
cron.schedule(cronExpression, async () => {
console.log(`\n⏰ Scheduled sync triggered (every ${interval} minutes)`);
try {
await runIncrementalSync();
} catch (error) {
console.error('Scheduled sync failed:', error.message);
}
});
console.log(`⏱️ Scheduled sync enabled (every ${interval} minutes)\n`);
}
} catch (error) {
console.error('Failed to start server:', error.message);
process.exit(1);
}
}
startServer();
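One subtlety with the `*/N * * * *` expression used above: it only produces evenly spaced runs when N divides 60. For example, `*/25` fires at :00, :25, and :50, then again at :00, leaving a 10-minute gap. A sketch of a guard you could add before scheduling (isEvenCronInterval is a hypothetical helper):

```javascript
// Sketch: minute-step cron expressions are only evenly spaced when the
// step divides 60; warn (or fall back) for values like 25 or 45.
function isEvenCronInterval(minutes) {
  const n = Number(minutes);
  return Number.isInteger(n) && n >= 1 && n <= 60 && 60 % n === 0;
}

if (!isEvenCronInterval(process.env.SYNC_INTERVAL_MINUTES || 15)) {
  console.warn('SYNC_INTERVAL_MINUTES does not divide 60; runs will be unevenly spaced');
}
```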
Step 4: Testing
Start the sync server:
npm run dev
Test 1: Health Check
curl http://localhost:3000/health
Expected output:
{
"status": "healthy",
"database": "connected",
"salesforce": "connected",
"timestamp": "2026-02-18T10:30:00.000Z"
}
Test 2: Trigger Full Sync
curl -X POST http://localhost:3000/sync/full
Expected output:
{
"success": true,
"result": {
"syncType": "full",
"recordsProcessed": 150,
"recordsInserted": 150,
"recordsUpdated": 0,
"errors": 0,
"status": "success"
}
}
Test 3: Verify Data in PostgreSQL
psql -U your_username -d your_database
-- Check total contacts synced
SELECT COUNT(*) FROM salesforce_contacts;
-- View recent contacts
SELECT salesforce_id, first_name, last_name, email, synced_at
FROM salesforce_contacts
ORDER BY synced_at DESC
LIMIT 10;
-- Check sync logs
SELECT * FROM sync_logs ORDER BY completed_at DESC LIMIT 5;
Test 4: Trigger Incremental Sync
First, update a contact in Salesforce UI, then:
curl -X POST http://localhost:3000/sync/incremental
Test 5: Get Sync Statistics
curl http://localhost:3000/sync/stats
Test 6: Run One-Time Sync from Command Line
npm run sync:once
Testing Checklist:
- ✓ Health endpoint returns “healthy” status
- ✓ Full sync imports all Salesforce contacts
- ✓ PostgreSQL tables contain contact data
- ✓ Incremental sync detects modified contacts
- ✓ Sync logs track all operations
- ✓ Scheduled sync runs automatically (check console logs)
- ✓ Contact updates in Salesforce appear in PostgreSQL
[SCREENSHOT SUGGESTION: Salesforce Setup page showing Connected App configuration with Consumer Key and Consumer Secret]
Common Errors & Troubleshooting
Error 1: “INVALID_LOGIN: Invalid username, password, security token”
Problem: Salesforce authentication fails with invalid credentials error.
Solution: The security token must be appended to your password. To get your security token:
- Log into Salesforce
- Go to Settings → My Personal Information → Reset My Security Token
- Check your email for the new token
- In .env, set the password and token separately:
SF_PASSWORD=MyPassword123
SF_SECURITY_TOKEN=AbCdEfGhIjKlMnOpQrStU
# src/salesforce.js appends the token to the password at login
Alternative: Use OAuth 2.0 instead of username/password flow for better security. Modify src/salesforce.js:
conn = new jsforce.Connection({
oauth2: {
loginUrl: process.env.SF_LOGIN_URL,
clientId: process.env.SF_CLIENT_ID,
clientSecret: process.env.SF_CLIENT_SECRET,
redirectUri: 'http://localhost:3000/oauth/callback'
}
});
// Note: with this configuration you still need to complete the
// authorization-code flow (redirect the user to
// conn.oauth2.getAuthorizationUrl(), then call conn.authorize(code)
// in your callback handler) before making API calls.
Error 2: “QUERY_TIMEOUT: Your query request was running for too long”
Problem: SOQL queries timeout when fetching large datasets (10,000+ contacts).
Solution: Use batch processing with queryMore() for pagination. This is already implemented in fetchContactsInBatches(), but ensure you’re calling it for large datasets:
// Instead of fetchAllContacts() for >10k records, use:
import { fetchContactsInBatches } from './salesforce.js';
const contacts = await fetchContactsInBatches(); // pages through all records via queryMore
Also consider: Adding a WHERE clause to filter records:
const query = `
SELECT Id, FirstName, LastName, Email
FROM Contact
WHERE CreatedDate > LAST_N_DAYS:90
ORDER BY LastModifiedDate DESC
`;
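Large result sets can also be written to PostgreSQL in smaller slices, which keeps each round of upserts short and makes progress logging meaningful. A minimal chunking helper (hypothetical, not part of the tutorial's files):

```javascript
// Sketch: split an array of records into fixed-size batches, e.g. to feed
// upsertContactsBatch() a few hundred rows at a time.
function chunk(records, size = 200) {
  const batches = [];
  for (let i = 0; i < records.length; i += size) {
    batches.push(records.slice(i, i + size));
  }
  return batches;
}

// e.g. for (const batch of chunk(allContacts, 500)) { await upsertContactsBatch(batch); }
```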
Error 3: “ERROR_DUPLICATE_ID” or Constraint Violation on Upsert
Problem: PostgreSQL throws unique constraint errors when syncing contacts.
Solution: This usually indicates corrupted sync state or duplicate Salesforce IDs. First, verify uniqueness:
-- Check for duplicate salesforce_id values
SELECT salesforce_id, COUNT(*)
FROM salesforce_contacts
GROUP BY salesforce_id
HAVING COUNT(*) > 1;
If duplicates exist, remove them:
-- Keep only the most recently synced record
DELETE FROM salesforce_contacts
WHERE id NOT IN (
SELECT MAX(id)
FROM salesforce_contacts
GROUP BY salesforce_id
);
Prevent future issues by ensuring your upsert query uses ON CONFLICT properly (already implemented in upsertContact()). If the issue persists, clear and re-sync:
TRUNCATE TABLE salesforce_contacts;
UPDATE sync_state SET last_sync_timestamp = '2000-01-01' WHERE id = 1;
Then run a fresh full sync via API: POST /sync/full
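If duplicates originate upstream (e.g. merged records or overlapping query windows), you can also de-duplicate in application code before the upsert. A sketch (dedupeByNewest is a hypothetical helper):

```javascript
// Sketch: collapse duplicate Salesforce IDs client-side, keeping the copy
// with the newest LastModifiedDate, before handing records to the upsert.
function dedupeByNewest(records) {
  const byId = new Map();
  for (const record of records) {
    const existing = byId.get(record.Id);
    if (!existing ||
        new Date(record.LastModifiedDate) > new Date(existing.LastModifiedDate)) {
      byId.set(record.Id, record);
    }
  }
  return [...byId.values()];
}
```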
[DIAGRAM SUGGESTION: Flowchart showing: Salesforce Contact Modified → Incremental Sync Triggered → Query Modified Records → Upsert to PostgreSQL → Update Sync Timestamp]
Security Checklist
Protect credentials and data with these production-ready practices:
- Never commit .env files — Use environment variables in production via AWS Secrets Manager, Azure Key Vault, or similar. Rotate Salesforce passwords and security tokens quarterly.
- Use OAuth 2.0 instead of the password flow — Username/password authentication is less secure than OAuth. Implement the OAuth flow with refresh tokens for production deployments.
- Encrypt database connections — Enable SSL for PostgreSQL connections in production:
import fs from 'node:fs';

const pool = new Pool({
  // ...host, port, database, user, password as before...
  ssl: {
    rejectUnauthorized: true,
    ca: fs.readFileSync('/path/to/ca-certificate.crt').toString()
  }
});
- Implement rate limiting — Salesforce enforces API limits (15,000 requests per 24 hours for Developer Edition). Rate-limit the sync endpoints so repeated manual triggers can't exhaust that quota:
import rateLimit from 'express-rate-limit';
const limiter = rateLimit({
windowMs: 24 * 60 * 60 * 1000, // 24 hours
max: 10000 // max requests per window
});
app.use('/sync/', limiter);
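The express limiter above protects the inbound endpoints; to guard the outbound Salesforce quota directly, jsforce exposes the org's API usage (parsed from the Sforce-Limit-Info response header) on conn.limitInfo. A sketch of a throttle check built on that (shouldThrottle is a hypothetical helper, and limitInfo may be empty before the first API call):

```javascript
// Sketch: skip a scheduled sync when the org is close to its daily API
// ceiling. conn.limitInfo looks like { apiUsage: { used, limit } } once
// at least one API request has completed.
function shouldThrottle(limitInfo, headroom = 0.9) {
  if (!limitInfo || !limitInfo.apiUsage) return false; // no data yet; proceed
  const { used, limit } = limitInfo.apiUsage;
  return used / limit >= headroom;
}

// e.g. if (shouldThrottle(conn.limitInfo)) { console.warn('Skipping sync: API quota nearly exhausted'); }
```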
- Validate and sanitize data — Although Salesforce data is generally trusted, validate email formats and sanitize inputs before inserting:
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
if (contact.Email && !emailRegex.test(contact.Email)) {
console.warn(`Invalid email for contact ${contact.Id}`);
contact.Email = null;
}
- Implement audit logging — The sync_logs table tracks sync operations, but add user activity logging for compliance (GDPR, HIPAA):
CREATE TABLE audit_log (
id SERIAL PRIMARY KEY,
user_id VARCHAR(255),
action VARCHAR(100),
table_name VARCHAR(100),
record_id VARCHAR(18),
changes JSONB,
timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
- Use connection pooling limits — Prevent resource exhaustion by limiting PostgreSQL connections (already set to max: 20 in the example).
- Secure API endpoints — Add authentication middleware to prevent unauthorized sync triggers:
app.use('/sync/', (req, res, next) => {
const apiKey = req.headers['x-api-key'];
if (apiKey !== process.env.INTERNAL_API_KEY) {
return res.status(401).json({ error: 'Unauthorized' });
}
next();
});
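One refinement to the key check above: a plain !== comparison can, in principle, leak information through response timing. A constant-time comparison avoids that; this sketch hand-rolls the loop for clarity (Node's crypto.timingSafeEqual does the same for equal-length buffers):

```javascript
// Sketch: constant-time string comparison -- the loop always runs over the
// full length, so timing doesn't reveal how many leading characters match.
function safeCompare(a, b) {
  const x = String(a);
  const y = String(b);
  if (x.length !== y.length) return false;
  let diff = 0;
  for (let i = 0; i < x.length; i++) {
    diff |= x.charCodeAt(i) ^ y.charCodeAt(i);
  }
  return diff === 0;
}

// In the middleware: if (!safeCompare(apiKey, process.env.INTERNAL_API_KEY)) { ... 401 ... }
```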
- Monitor for anomalies — Set up alerts for unusual activity (sudden spikes in errors, mass deletions). Use PostgreSQL's pg_stat_statements extension for query monitoring.
Ready to Scale Your Integration?
Building robust integrations requires expertise in APIs, database design, and error handling. If you need help implementing custom Salesforce integrations, migrating legacy systems, or architecting scalable data pipelines, schedule a consultation with our team. We’ll help you build production-ready solutions that save time and eliminate manual data exports.

Maha is a Senior Software Engineer specializing in scalable web architecture and API development. She focuses on creating production-ready integration guides that help developers bridge the gap between complex backend systems and frontend usability. When she isn’t debugging, Maha explores modern dev-tools and the future of automated workflows.

