
We are using the latest Alpakka S3 connector to download a file as a stream: a Source (the S3 download) feeding a Flow that publishes to an AMQP server via the Java amqp-client library. Files smaller than 8 MB are processed fine, but larger files fail with:

> exceeded content length limit (8388608 bytes)! You can configure this by setting `akka.http.[server|client].parsing.max-content-length` or calling `HttpEntity.withSizeLimit` before materializing the dataBytes stream.
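
The error text itself points at a workaround: raise (or disable) the client-side `parsing.max-content-length` before materializing the stream. A minimal sketch, assuming we create the `ActorSystem` ourselves (the system name and the `100m` value are illustrative; depending on the Alpakka version, the pooled client may read `akka.http.host-connection-pool.client.parsing.max-content-length` instead):

    import akka.actor.ActorSystem
    import com.typesafe.config.ConfigFactory

    // Raise the client-side entity size limit ("infinite" disables the check)
    val config = ConfigFactory
      .parseString("akka.http.client.parsing.max-content-length = 100m")
      .withFallback(ConfigFactory.load())

    implicit val system: ActorSystem = ActorSystem("s3-to-amqp", config)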

  • Akka-http does not keep the file in memory; it streams directly from the source. Do we need to buffer the whole file in memory first and then stream it?

  • Or is the downstream, i.e. the AMQP Java client library (5.3.0), causing
    this issue? We are using one connection and one channel for RabbitMQ,
    set up roughly as in the sketch below.
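
A minimal sketch of that setup, with hypothetical host and variable names:

    import com.rabbitmq.client.{Channel, Connection, ConnectionFactory}

    // One connection and one channel, shared by the whole stream
    val connectionFactory = new ConnectionFactory()
    connectionFactory.setHost("localhost") // placeholder host
    val rabbitMQConnection: Connection = connectionFactory.newConnection()
    val rabbitMQChannel: Channel = rabbitMQConnection.createChannel()

The stream itself is wired as follows: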

    import akka.stream.scaladsl.{Flow, Framing, Sink}
    import akka.util.ByteString
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.Future

    // Download from S3 and split the byte stream into newline-delimited lines
    val source = s3Client.download(bucketName, key)._1
      .via(Framing.delimiter(ByteString("\n"), Constant.BYTE_SIZE, allowTruncation = true))
      .map(_.utf8String)

    // Publish each line to RabbitMQ; parallelism = 1 keeps every publish on
    // the single shared channel, one at a time
    val flow = Flow[String].mapAsync(parallelism = 1) { message =>
      Future {
        rabbitMQChannel.basicPublish(exchangeName, routingKey, amqpProperties,
          message.getBytes)
      }
    }

    val result = source.via(flow).runWith(Sink.ignore)
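
One note on the `parallelism = 1` choice above: the RabbitMQ Java client's `Channel` is not safe for concurrent publishes from multiple threads, so with a single shared channel the `mapAsync` parallelism should stay at 1; anything higher would need one channel per worker.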
    