Discussion:
[Mifos-developer] Migrating loan transactions outside of the API
James Rowe
2016-01-28 22:53:59 UTC
Permalink
Greetings everyone! I've been working with an MFI for some time now, trying
to migrate their data from their old system onto the Mifos X platform. I
tried migrating with the DataImportTool, but it doesn't support everything
we need when migrating loan transactions. So instead, I wrote a simple
Python script to migrate the transaction history. It works for small
numbers of records, but the API seems to be extremely CPU intensive, and
for larger loads we end up maxing out every core. We decided to run the
migration on the largest AWS instance we could, but it's still taking a
long time even with many CPUs: http://i.imgur.com/9lGO7eI.gifv
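For context, the script essentially builds JSON bodies like the one below
and POSTs them one at a time. This is a minimal sketch, assuming the
Mifos X loan-transactions endpoint (POST /loans/{loanId}/transactions?command=repayment);
the field names should be verified against your platform version.

```python
import json

def repayment_payload(date_str, amount):
    # Build the JSON body for posting one repayment. The dateFormat/locale
    # fields are required by the Mifos X API for date parsing (assumed here).
    return {
        "dateFormat": "dd MMMM yyyy",
        "locale": "en",
        "transactionDate": date_str,   # e.g. "28 January 2016"
        "transactionAmount": amount,
    }

body = json.dumps(repayment_payload("28 January 2016", 150.0))
```

Each such request triggers the server-side recalculation described below,
which is where the CPU time goes.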

I'm looking into alternative ways to import the loan transaction history
into Mifos X, but I'm struggling to figure everything out. What exactly
does Mifos do on loan transactions that makes it consume so much CPU? Is
there a way to dump the transaction history directly into the database and
still have everything that Mifos needs to run? I currently only use the
API for transaction history because I don't know how Mifos stores
transactions in the database. I'm not sure what the m_loan_arrears_aging
table does, or what I should put in each of the _derived columns in the
m_loan_arrears_aging and m_loan_transaction tables. I'm guessing those are
all fields that Mifos X calculates when a transaction is made through the
API, but for a migration they don't seem necessary. Are these fields
needed? Is it safe to leave them blank during migrations, or to fill them
in with dummy data? Even if they are required, once I figure out what
these fields stand for and what is done on a per-transaction basis, I
could roll up the transactions in my script and write them out to the
database directly. Since it's a migration, I already know the final state
of each loan being migrated, so I imagine that if I skip the
recalculation every time, the migration will finish much, much faster.
Any advice would be greatly appreciated.

Thanks,
James
Ed Cable
2016-01-28 22:56:47 UTC
Permalink
James,

Have you looked into using an ETL-based tool to build a migration path
specific to your source system, like the one Nayan and Conflux built for
Mifos 2 to Mifos X migrations?

https://github.com/openMF/move-to-mifosx

Cheers,

Ed
Mifos-developer mailing list
https://lists.sourceforge.net/lists/listinfo/mifos-developer
--
*Ed Cable*
Director of Community Programs, Mifos Initiative
***@mifos.org | Skype: edcable | Mobile: +1.484.477.8649

*Collectively Creating a World of 3 Billion Maries | *http://mifos.org
Sander van der Heyden
2016-01-29 13:09:12 UTC
Permalink
Hi James,

We've also done various migrations through PDI with jobs similar to the
one mentioned above. However, the key thing to be aware of when using the
tasks created by Nayan is that you don't get the accounting entries linked
to the loans. If that is a requirement, then the API import will be the
only really feasible way. What we have normally done so far for almost all
migrations where accounting is a requirement (including ones with 1.5
million loan repayments) is:
- Always disable all scheduled jobs before you start a migration!
- Create all loan accounts via the API.
- Update their status to approved with a single SQL statement (this avoids
another round of API calls, and no other tables are updated at that step).
- Disburse via API calls.
- Post loan transactions via API calls. Make sure they are sorted by date,
with the most recent payment coming last: Mifos X will reprocess all
previous entries if you backdate a transaction before an existing one.
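The bulk-approval and ordering steps above could be sketched roughly as
follows. The status codes and table/column names in the SQL string are
assumptions based on the m_loan tables discussed in this thread; verify
them against your own schema before running anything like this.

```python
from datetime import date

# Step 3: approve all pending loans in one statement instead of N API calls.
# 100 = 'submitted' and 200 = 'approved' are assumed status codes.
BULK_APPROVE_SQL = (
    "UPDATE m_loan SET loan_status_id = 200 "
    "WHERE loan_status_id = 100"
)

def order_for_posting(transactions):
    # Step 5: sort oldest-first so Mifos X never has to reprocess
    # earlier entries when a transaction is posted.
    return sorted(transactions, key=lambda t: t[0])

# Hypothetical (date, amount) records from the legacy system.
txns = [(date(2016, 1, 10), 50.0), (date(2015, 12, 1), 75.0)]
ordered = order_for_posting(txns)
```

The ordering rule is the important part: one backdated payment can force
the platform to re-derive every later entry on that loan.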

When posting via the API we used a PHP script with the curl_multi_*
functions to post entries in parallel. Depending on how well you scale
your Amazon DB instance and Tomcat instance, we were able to run 1.5
million entries in about 7 hours, including all journal entries.
Ultimately, if you can get away from the accounting and are allowed to
instead post a bulk entry in the journals after migration, then I would
definitely go for Nayan's approach, as it'll save you a hell of a lot of
time.
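The same fan-out idea in Python (rather than PHP's curl_multi_*) might
look like the sketch below. post_fn would wrap the real HTTP call to the
Mifos X API; here the caller supplies it, so the sketch stays
self-contained, and max_workers should be tuned to what Tomcat and MySQL
can absorb.

```python
from concurrent.futures import ThreadPoolExecutor

def post_all(entries, post_fn, max_workers=8):
    # Fan the entries out over a thread pool; pool.map preserves the
    # input order in its results, so per-loan date ordering can be kept
    # by submitting one loan's transactions to the same worker batch.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(post_fn, entries))

results = post_all([1, 2, 3, 4],
                   lambda amount: {"amount": amount, "status": "posted"})
```

Note that naive parallelism across transactions of the *same* loan can
trip the reprocessing behaviour mentioned above, so parallelise across
loans, not within one.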

If you are dealing with interest-recalculation loans in combination with
data migration, also let us know, as we've done some tricks to speed that
up significantly as well.

Thanks,
Sander



Sander van der Heyden

CTO Musoni Services




Mobile (NL): +31 (0)6 14239505
Mobile (Kenya): +254 (0)707211284
Skype: s.vdheyden
Website: musonisystem.com
Follow us on Twitter! <https://twitter.com/musonimfi>
Postal address: Hillegomstraat 12-14, office 1.11, 1058 LS, Amsterdam,
The Netherlands