I'm trying to create a Cloud Function whose main purpose is to detect when a file is uploaded into a folder on an SFTP server and then copy that file into a bucket in GCP. There were no errors when I deployed the code, but when I manually upload the file to SFTP nothing happens.
Here is the Cloud Function configuration:
Python 3.9
trigger: cloud storage
event: google.cloud.storage.object.v1.finalized
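For context, my understanding is that with this trigger the function only receives metadata about the object that was finalized in the bucket, roughly like the sketch below (the field values are placeholders, not my real data):

# Approximate shape of the event payload for a storage "finalized" trigger
# (placeholder values, for illustration only).
sample_event = {
    'bucket': 'bucket_test',            # bucket that fired the event
    'name': 'FILE1',                    # object that was finalized
    'contentType': 'application/octet-stream',
    'size': '1024',
    'timeCreated': '2023-01-01T00:00:00.000Z',
}

# Inside the function, the triggering object can be read from the event:
print(f"Triggered by gs://{sample_event['bucket']}/{sample_event['name']}")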
Main function
import paramiko
from google.cloud import storage


def main_function(event, context):
    # Configuration variables
    sftp_host = '-----------.com'
    sftp_port = 22
    sftp_username = '123456789'
    sftp_password = '1234567891%'
    sftp_directory = '/Import/TEST_DATA'
    file_name = 'FILE1'
    bucket_name = 'bucket_test'

    # Connect to the SFTP server
    transport = paramiko.Transport((sftp_host, sftp_port))
    transport.connect(username=sftp_username, password=sftp_password)
    sftp = paramiko.SFTPClient.from_transport(transport)

    # Download the file from the SFTP server
    remote_file_path = f'{sftp_directory}/{file_name}'
    local_file_path = f'/tmp/{file_name}'
    sftp.get(remote_file_path, local_file_path)

    # Upload the file to Google Cloud Storage
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(file_name)
    blob.upload_from_filename(local_file_path)

    # Close the SFTP connection
    sftp.close()
    transport.close()

    return 'Process completed'
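In case it helps, this is roughly how I would call the function locally to check the SFTP and GCS parts by themselves (a minimal sketch: it assumes the SFTP host is reachable from my machine, that Google Cloud credentials are set up locally, and the event fields are placeholders):

# Minimal local harness to invoke the function directly (sketch only).
# Assumes network access to the SFTP host and local GCP credentials
# (e.g. via `gcloud auth application-default login`).
fake_event = {
    'bucket': 'bucket_test',   # placeholder: bucket that would fire the trigger
    'name': 'FILE1',           # placeholder: object that was finalized
}

if __name__ == '__main__':
    result = main_function(fake_event, None)  # context is not used by the function
    print(result)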
Requirements
google-cloud-bigquery==2.25.1
pysftp==0.2.9
pandas==1.4.2
fsspec==2022.5.0
gcsfs==2022.5.0
paramiko==2.8.0
google-cloud-storage==1.44.0
I would appreciate any help. Thank you.