Welcome to ExamTopics

Exam MCIA - Level 1 topic 1 question 75 discussion

Actual exam question from MuleSoft's MCIA - Level 1
Question #: 75
Topic #: 1
[All MCIA - Level 1 Questions]

A Mule application is being designed to receive nightly a CSV file containing millions of records from an external vendor over SFTP. The records from the file need to be validated, transformed, and then written to a database. Records can be inserted into the database in any order.
In this use case, what combination of Mule components provides the most effective and performant way to write these records to the database?

  • A. Use a Batch Job scope to bulk insert records into the database
  • B. Use a Scatter-Gather to bulk insert records into the database
  • C. Use a Parallel For Each scope to insert records one by one into the database
  • D. Use a DataWeave map operation and an Async scope to insert records one by one into the database
Suggested Answer: D

Comments

Alandt
2 weeks, 5 days ago
Selected Answer: A
A is correct according to official practice exam
upvoted 1 times
...
madgeezer
1 year, 11 months ago
Selected Answer: A
A. Use a Batch Job scope to bulk insert records into the database.
Reliability: if processing must survive a runtime crash or other failure scenarios, and resume with the remaining records on restart, choose a Batch Job, since it uses persistent queues.
Memory footprint: the question says there are millions of records to process. Parallel For Each aggregates all processed records at the end and can cause an Out Of Memory error. A Batch Job instead provides a BatchJobResult in the On Complete phase, where you can get the counts of failed and successful records. For huge file processing where order is not a concern, definitely go with a Batch Job.
upvoted 2 times
...
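The Batch Job approach described above could be sketched as Mule 4 XML along the following lines. This is a minimal illustration, not the exam's reference solution: the config names (SFTP_Config, Database_Config), the polling schedule, the directory, and the table/column names are all assumptions for the example.

```xml
<!-- Sketch of option A: SFTP listener feeding a Batch Job that bulk-inserts.
     SFTP_Config, Database_Config, /inbound, vendor_records, and the columns
     are illustrative placeholders, not from the question. -->
<flow name="nightly-csv-import-flow">
  <!-- Poll the vendor's SFTP server for the nightly CSV file -->
  <sftp:listener config-ref="SFTP_Config" directory="/inbound"
                 autoDelete="true" outputMimeType="application/csv">
    <scheduling-strategy>
      <fixed-frequency frequency="1" timeUnit="DAYS"/>
    </scheduling-strategy>
  </sftp:listener>

  <batch:job jobName="csv-to-db-job">
    <batch:process-records>
      <batch:step name="validate-and-transform">
        <!-- Per-record validation and transformation go here,
             e.g. a Validation component and a DataWeave transform -->
      </batch:step>
      <batch:step name="write-to-db">
        <!-- Aggregate records so each database call inserts many rows -->
        <batch:aggregator size="1000">
          <db:bulk-insert config-ref="Database_Config">
            <db:sql>INSERT INTO vendor_records (id, amount)
                    VALUES (:id, :amount)</db:sql>
          </db:bulk-insert>
        </batch:aggregator>
      </batch:step>
    </batch:process-records>
    <batch:on-complete>
      <!-- payload here is a BatchJobResult with success/failure counts -->
      <logger level="INFO"
              message="#['Loaded $(payload.successfulRecords) records']"/>
    </batch:on-complete>
  </batch:job>
</flow>
```

Because records can be inserted in any order, the Batch Job's parallel record processing is safe here, and the aggregator keeps only a bounded group of records in memory at a time rather than the whole file.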
Outdoor25
2 years, 6 months ago
Selected Answer: A
A is the correct answer. A bulk insert into the database is more efficient than multiple individual inserts.
upvoted 1 times
...
Outdoor25
2 years, 6 months ago
Should be A.
upvoted 1 times
...
KrishnVams
2 years, 9 months ago
Correct Answer: A. Use a Batch Job scope to bulk insert records into the database
upvoted 2 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other