Access a specified key from an S3 bucket?


I have an S3 bucket xxx. I wrote a Lambda function that reads data from the S3 bucket and writes those details to an RDS PostgreSQL instance, and my code does that successfully. I added a trigger so the Lambda function is invoked whenever a file lands on S3.

But my code can only read the file named 'SampleData.csv'. Consider the code given below:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;

public class LambdaFunctionHandler implements RequestHandler<S3Event, String> {

    private AmazonS3 s3 = AmazonS3ClientBuilder.standard().build();

    public LambdaFunctionHandler() {}

    // Test purpose only.
    LambdaFunctionHandler(AmazonS3 s3) {
        this.s3 = s3;
    }

    @Override
    public String handleRequest(S3Event event, Context context) {
        context.getLogger().log("Received event: " + event);
        String bucket = "xxx";
        String key = "SampleData.csv";

        System.out.println(key);

        try {
            S3Object response = s3.getObject(new GetObjectRequest(bucket, key));
            String contentType = response.getObjectMetadata().getContentType();
            context.getLogger().log("CONTENT TYPE: " + contentType);

            // Read the source file as text (a second GET, just to print the body).
            String body = s3.getObjectAsString(bucket, key);
            System.out.println("Body: " + body);
            System.out.println();
            System.out.println("Reading as stream.....");
            System.out.println();

            // Save the CSV data to the database, one row per line.
            try (BufferedReader br = new BufferedReader(
                    new InputStreamReader(response.getObjectContent()))) {
                Class.forName("org.postgresql.Driver");
                try (Connection con = DriverManager.getConnection(
                             "jdbc:postgresql://ENDPOINT:5432/DBNAME", "USER", "PASSWORD");
                     // A PreparedStatement avoids SQL injection and is reused for every row.
                     PreparedStatement statement = con.prepareStatement(
                             "insert into schema.tablename(name) values(?)")) {
                    System.out.println("Connected");
                    String csvOutput;
                    // Read until end of file.
                    while ((csvOutput = br.readLine()) != null) {
                        String[] str = csvOutput.split(",");
                        statement.setString(1, str[1]);
                        statement.executeUpdate();
                    }
                    System.out.println("Inserted Successfully!!!");
                }
            } catch (Exception ase) {
                context.getLogger().log(String.format(
                        "Error inserting object %s from bucket %s into the database.",
                        key, bucket));
                // throw ase;
            }

            return contentType;
        } catch (Exception e) {
            e.printStackTrace();
            context.getLogger().log(String.format(
                    "Error getting object %s from bucket %s. Make sure they exist and"
                    + " your bucket is in the same region as this function.", key, bucket));
            // Wrapped in an unchecked exception: handleRequest declares no checked throws.
            throw new RuntimeException(e);
        }
    }
}

As you can see from my code, I hardcoded key = "SampleData.csv". Is there a way to get the keys inside the bucket without specifying a particular file name?

amazon-web-services amazon-s3 s3-bucket
2 Answers

0 votes

These links will help:

http://docs.aws.amazon.com/AmazonS3/latest/dev/ListingKeysHierarchy.html
http://docs.aws.amazon.com/AmazonS3/latest/dev/ListingObjectKeysUsingJava.html

You can list objects using a prefix and delimiter to find the key you are looking for without passing a specific file name, as in the sketch below.
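
A minimal sketch of that approach, assuming the AWS SDK for Java v1 client the question already builds; the bucket name xxx comes from the question, while the prefix reports/ is a made-up example:

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ListObjectsV2Request;
import com.amazonaws.services.s3.model.ListObjectsV2Result;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class ListKeysExample {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.standard().build();
        ListObjectsV2Request request = new ListObjectsV2Request()
                .withBucketName("xxx")      // bucket from the question
                .withPrefix("reports/")     // hypothetical prefix; narrows the listing
                .withDelimiter("/");        // groups deeper "sub-folders" out of the result
        ListObjectsV2Result result;
        do {
            result = s3.listObjectsV2(request);
            for (S3ObjectSummary summary : result.getObjectSummaries()) {
                System.out.println(summary.getKey());
            }
            // Listings are paginated (up to 1000 keys per page); follow the token.
            request.setContinuationToken(result.getNextContinuationToken());
        } while (result.isTruncated());
    }
}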


0 votes

If you need the event details on S3, you can actually enable an S3 event notification to the Lambda function. Refer the link. You can enable this feature as follows (a programmatic sketch follows the list):

  1. Click 'Properties' in your bucket
  2. Click 'Events'
  3. Click 'Add notification'
  4. Give it a name and select the event type (e.g. Put, Delete, etc.)
  5. Give a prefix and suffix if necessary; otherwise leave them blank to consider all events
  6. Then choose 'Send to' Lambda function and provide the Lambda ARN.
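
The same configuration can also be applied programmatically. The sketch below is an illustration with the AWS SDK for Java v1, not part of the original answer; the configuration name InvokeOnCreate and the Lambda ARN are placeholders:

import java.util.EnumSet;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.BucketNotificationConfiguration;
import com.amazonaws.services.s3.model.LambdaConfiguration;
import com.amazonaws.services.s3.model.S3Event;

public class EnableNotificationExample {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.standard().build();
        BucketNotificationConfiguration config = new BucketNotificationConfiguration();
        // "InvokeOnCreate" and the ARN are placeholders, not values from the answer.
        config.addConfiguration("InvokeOnCreate", new LambdaConfiguration(
                "arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
                EnumSet.of(S3Event.ObjectCreated)));
        s3.setBucketNotificationConfiguration("xxx", config);
    }
}

Note that when configuring it this way (rather than through the console), the function's resource-based policy must also allow S3 to invoke it, e.g. via lambda add-permission.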

Now the event details will be sent to the Lambda function as JSON, and you can get the details from that JSON. The input will look like this:

{
  "Records": [
    {
      "eventVersion": "2.0",
      "eventSource": "aws:s3",
      "awsRegion": "ap-south-1",
      "eventTime": "2017-11-23T09:25:54.845Z",
      "eventName": "ObjectRemoved:Delete",
      "userIdentity": { "principalId": "AWS:AIDAJASDFGZTLA6UZ7YAK" },
      "requestParameters": { "sourceIPAddress": "52.95.72.70" },
      "responseElements": {
        "x-amz-request-id": "A235BER45D4974E",
        "x-amz-id-2": "glUK9ZyNDCjMQrgjFGH0t7Dz19eBrJeIbTCBNI+Pe9tQugeHk88zHOY90DEBcVgruB9BdU0vV8="
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "SNS",
        "bucket": {
          "name": "example-bucket1",
          "ownerIdentity": { "principalId": "AQFXV36adJU8" },
          "arn": "arn:aws:s3:::example-bucket1"
        },
        "object": {
          "key": "SampleData.csv",
          "sequencer": "005A169422CA7CDF66"
        }
      }
    }
  ]
}

You can access the key with objectname = event['Records'][0]['s3']['object']['key'] (oops, that snippet is Python) and then send these details to RDS. A Java equivalent is sketched below.
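
Since the question is in Java, here is a minimal sketch of the same lookup using the aws-lambda-java-events types the question's handler already uses; the class name S3KeyFromEventHandler is made up:

import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;

public class S3KeyFromEventHandler implements RequestHandler<S3Event, String> {
    @Override
    public String handleRequest(S3Event event, Context context) {
        // A single notification can carry several records; this sketch reads the first.
        String bucket = event.getRecords().get(0).getS3().getBucket().getName();
        String key = event.getRecords().get(0).getS3().getObject().getKey();
        try {
            // Keys arrive URL-encoded (e.g. spaces as '+'), so decode before use.
            key = URLDecoder.decode(key, "UTF-8");
        } catch (UnsupportedEncodingException e) {
            throw new RuntimeException(e);
        }
        context.getLogger().log("Bucket: " + bucket + ", key: " + key);
        return key;
    }
}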
