Bug 41882 - ucs-school-import: lower memory usage of import
Status: CLOSED FIXED
Product: UCS@school
Classification: Unclassified
Component: Import scripts
Version: UCS@school 4.1 R2
Hardware: Other Linux
Importance: P5 enhancement
Target Milestone: UCS@school 4.1 R2 v11
Assigned To: Daniel Tröder
QA Contact: Sönke Schwardt-Krummrich
Depends on:
Blocks: 43936 47413
Reported: 2016-08-01 15:04 CEST by Daniel Tröder
Modified: 2018-07-30 14:20 CEST
CC List: 3 users

See Also:
What kind of report is it?: Bug Report
What type of bug is this?: 5: Major Usability: Impairs usability in key scenarios
Who will be affected by this bug?: 4: Will affect most installed domains
How will those affected feel about the bug?: 3: A User would likely not purchase the product
User Pain: 0.343
Enterprise Customer affected?:
School Customer affected?: Yes
ISV affected?:
Waiting Support:
Flags outvoted (downgraded) after PO Review:
Ticket number:
Bug group (optional):
Max CVSS v3 score:


Attachments
massively lower memory footprint of user import (14.69 KB, patch)
2016-10-05 20:16 CEST, Daniel Tröder

Description Daniel Tröder univentionstaff 2016-08-01 15:04:00 CEST
A customer's system with 32 GB RAM could not run an import job with 70,000 users without adding several GB of swap.

Ideas:
* invalidate the ucsschool.lib cache at certain points (e.g. after the delete phase)
* remove ImportUser objects from memory after create() / modify() / delete() and keep only textual representations for the three exports (statistics, passwords and summary), or write those as we go; see the sketch below
* delete Exception objects and keep only textual representations (most of the exceptions carry the relevant ImportUser objects)
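
To make the second idea concrete, here is a minimal, self-contained sketch. ImportUser, to_dict(), entry_count and create_and_modify_users() are simplified stand-ins here, not the actual ucs-school-import code:

import collections


class ImportUser(object):
    """Simplified stand-in for the real ImportUser class."""
    def __init__(self, entry_count, name):
        self.entry_count = entry_count       # line number in the input CSV
        self.name = name
        self._heavy_state = bytearray(1000)  # about 0.28 MB per object in the real import

    def create(self):
        pass                                 # would talk to UDM/LDAP here

    def to_dict(self):
        # small, cheap representation for the statistics/password/summary exports
        return {"entry_count": self.entry_count, "name": self.name}


def create_and_modify_users(imported_users):
    added, errors = [], []
    while imported_users:
        user = imported_users.popleft()      # drop the queue's reference to the object
        try:
            user.create()
        except Exception as exc:
            # keep only text, not the exception object (it would keep the user alive)
            errors.append("line %d: %s" % (user.entry_count, exc))
        else:
            added.append(user.to_dict())
        # the only remaining reference is the local name "user"; it is rebound on the
        # next iteration, so the heavy object can be freed right away
    return added, errors


users = collections.deque(ImportUser(i, "user%d" % i) for i in range(1000))
added, errors = create_and_modify_users(users)
print("created: %d  errors: %d" % (len(added), len(errors)))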
Comment 1 Daniel Tröder univentionstaff 2016-10-05 20:16:56 CEST
Created attachment 8069 [details]
massively lower memory footprint of user import

The attached patch massively lowers the memory footprint of the user import script, mainly by not collecting ImportUser objects. It also releases created objects as early as possible and fixes the usage of some class attributes.

Each ImportUser object has a size of about 0.28 MB (70,000 users ≈ 19 GB RAM)!
Deleting 1,000 users also cost 280 MB of RAM.

A positive side effect is that reading the CSV now takes a tenth of the time (because the users are initialized later), so errors in the CSV can be found faster.

The patch includes the code used to display the memory usage (pip install memory_profiler).
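
For reference, the line-by-line tables below are produced with memory_profiler roughly like this; the decorated function is only a stand-in for the real import_users() method:

from memory_profiler import profile


@profile(precision=3)
def import_users():
    # stand-in workload; the real method calls read_input(), delete_users(), ...
    users = [{"name": "user%d" % i} for i in range(100000)]
    return len(users)


if __name__ == "__main__":
    import_users()   # prints a "Line #  Mem usage  Increment  Line Contents" table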

A search for memory leaks using Heapy (pip install guppy) did not yield useful results (it rather sent me on a wild goose chase inside UDM).
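
The Heapy run was ordinary guppy usage along these lines (a generic example, not the exact code used here):

from guppy import hpy   # "guppy" on Python 2, "guppy3" on Python 3

hp = hpy()
data = [{"n": i} for i in range(10000)]   # stand-in for the import work
print(hp.heap())                          # per-type breakdown of all live objects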


Benchmarks:

OLD: Creation of 1000 users

Line #    Mem usage    Increment   Line Contents
================================================
    93   53.312 MiB    0.000 MiB   	@profile(precision=3)
    94                             	def import_users(self):
    95   53.312 MiB    0.000 MiB   		self.logger.info("------ Importing users... ------")
    96   54.531 MiB    1.219 MiB   		user_import = self.factory.make_user_importer(self.dry_run)
    97   63.797 MiB    9.266 MiB   		imported_users = user_import.read_input()
    98   63.832 MiB    0.035 MiB   		users_to_delete = user_import.detect_users_to_delete()
    99   63.832 MiB    0.000 MiB   		user_import.delete_users(users_to_delete)
   100  343.383 MiB  279.551 MiB   		user_import.create_and_modify_users(imported_users)

real	31m54.401s
user	23m57.452s
sys	0m11.304s

NEW: Creation of 1000 users

Line #    Mem usage    Increment   Line Contents
================================================
    95   53.496 MiB    0.000 MiB   	@profile(precision=3)
    96                             	def import_users(self):
    97   53.496 MiB    0.000 MiB   		self.logger.info("------ Importing users... ------")
    98   54.641 MiB    1.145 MiB   		user_import = self.factory.make_user_importer(self.dry_run)
    99   60.340 MiB    5.699 MiB   		imported_users = user_import.read_input()
   100   60.504 MiB    0.164 MiB   		users_to_delete = user_import.detect_users_to_delete()
   101   60.973 MiB    0.469 MiB   		user_import.delete_users(users_to_delete)
   102   63.137 MiB    2.164 MiB   		user_import.create_and_modify_users(imported_users)

real	19m5.799s
user	11m25.100s
sys	0m10.312s

NEW: Creation of 2000 users

root@sch-m-71:~/sync/ucs-school-import# time usr/share/ucs-school-import/scripts/ucs-school-testuser-import -v --teachers 1000 --students 1000 --classes 50 SchuleVier SchuleFuenf SchuleSechs SchuleSieben

Created ImportTeacher: 1000
Created ImportStudent: 1000
Errors: 0

Line #    Mem usage    Increment   Line Contents
================================================
    95   53.316 MiB    0.000 MiB        @profile(precision=3)
    96                                  def import_users(self):
    98   54.938 MiB    1.621 MiB                user_import = self.factory.make_user_importer(self.dry_run)
    99   66.957 MiB   12.020 MiB                imported_users = user_import.read_input()
   100   67.426 MiB    0.469 MiB                users_to_delete = user_import.detect_users_to_delete()
   101   67.625 MiB    0.199 MiB                user_import.delete_users(users_to_delete)
   102   73.109 MiB    5.484 MiB                user_import.create_and_modify_users(imported_users)
   113   73.109 MiB    0.000 MiB                self.logger.info("------ Importing users done. ------")

real    38m10.582s
user    23m4.436s
sys     0m21.100s

--------------------------------------------------------------------

That seems to indicate the following formula:
53MB + 10 MiB / 100 users = 53MB + 0.01 MiB / user

So for 70,000 users that'd be 753 MiB.
Comment 2 Daniel Tröder univentionstaff 2016-10-05 20:26:06 CEST
BTW: the patch is not complete yet; there are still some gaps in reporting and legacy support.

(In reply to Daniel Tröder from comment #1)
> That seems to indicate the following formula:
> 53MB + 10 MiB / 100 users = 53MB + 0.01 MiB / user
> 
> So for 70,000 users that'd be 753 MiB.
That's actually 10 MiB / 1000 users → 53 MiB + 0.001 MiB / user

So for 70,000 users that'd be 123 MiB! :)

I will conduct bigger import benchmarks tomorrow.
Comment 3 Daniel Tröder univentionstaff 2016-10-06 08:01:22 CEST
Good news, everyone: the last comment was a bit delirious, but here are the values for 10,000 users:

2016-10-06 05:09:16 INFO  Created ImportStaff: 2500
2016-10-06 05:09:17 INFO  Created ImportStudent: 2500
2016-10-06 05:09:17 INFO  Created ImportTeacher: 2500
2016-10-06 05:09:18 INFO  Created ImportTeachersAndStaff: 2500

Line #    Mem usage    Increment   Line Contents
================================================
 95   53.242 MiB   0.000 MiB  @profile(precision=3)
 96                           def import_users(self):
 98   54.754 MiB   1.512 MiB  make_user_importer(self.dry_run)
 99  117.438 MiB  62.684 MiB  user_import.read_input()
100  119.582 MiB   2.145 MiB  user_import.detect_users_to_delete()
101  119.586 MiB   0.004 MiB  user_import.delete_users(users_to_delete)
102  142.391 MiB  22.805 MiB  user_import.create_and_modify_users(imported_users)

real    280m51.149s
user    186m32.012s
sys     1m54.024s

→ 53 MiB + 0.009 MiB (9 KB) per user → 70,000 users → 683 MB.
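
Spelled out as a quick sanity check (the 0.009 MiB figure is the rounded per-user increment measured above):

base_mib = 53.0        # footprint before import_users() starts doing real work
per_user_mib = 0.009   # ~9 KB per user, measured above with 10,000 users
print(base_mib + per_user_mib * 70000)   # -> 683.0, i.e. the ~683 MB estimate above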


Regarding execution time: it is I/O-bound. The LDAP server is writing a lot, probably doing a lot of fsyncs. Is there a way to make UDM commit changes to LDAP in bulk (e.g. x users at once)?
Comment 4 Sönke Schwardt-Krummrich univentionstaff 2016-10-06 10:47:45 CEST
(In reply to Daniel Tröder from comment #3)
> Good news everyone (the last comment was a bit delirious, but) here are the
> values for 10,000 users:
> 
> 2016-10-06 05:09:16 INFO  Created ImportStaff: 2500
> 2016-10-06 05:09:17 INFO  Created ImportStudent: 2500
> 2016-10-06 05:09:17 INFO  Created ImportTeacher: 2500
> 2016-10-06 05:09:18 INFO  Created ImportTeachersAndStaff: 2500
[…] 
> real    280m51.149s
> user    186m32.012s
> sys     1m54.024s

→ (280*60+51)/10000.0 → 1.6851 seconds per user

This is a good value. The larger the LDAP groups (e.g. "Domain Users $OU") become, the longer the creation of a new user takes.

4.7 hours for 10,000 users is OK.

> Regarding execution time: it is IO-bound. The LDAP server is writing a lot,
> probably doing a lot of fsync. Is there a way to make UDM commit changes in
> bulk (like x users at once) to LDAP?

I don't think so. For the initial import it is theoretically possible to disable the LDAP sync (which writes LDAP data to disk immediately so that the data is always consistent in case of a crash). This would speed up the import enormously, but it increases the risk of data loss.

I'm ok with the current values (time and memory footprint).
Comment 5 Daniel Tröder univentionstaff 2016-10-07 14:34:19 CEST
I just imported 50,000 users into 60 schools on a single-server S4 system with 4 cores and 4 GB of memory, where I had deactivated HDD sync (unsafe!) and OpenLDAP MDB sync (unsafe!).


time for ((a=1; a<61; a++)); do /usr/share/ucs-school-import/scripts/create_ou "Schule$a"; done
real	11m46.168s
user	4m57.212s
sys	0m29.736s


time usr/share/ucs-school-import/scripts/ucs-school-testuser-import -v --teachers 5000 --students 40000 --staff 2000 --staffteachers 3000 --classes 2000 $(for ((a=1; a<61; a++)); do echo -n "Schule$a "; done)


01:08:12 INFO  ------ Starting to read users from input data... ------
01:16:15 INFO  ------ Read 50000 users from input data. ------
14:04:21 INFO  Created ImportStaff: 2000
14:04:21 INFO  Created ImportStudent: 40000
14:04:32 INFO  Created ImportTeacher: 5000
14:04:34 INFO  Created ImportTeachersAndStaff: 3000

14:05:09 INFO  Line #  Mem usage    Increment  Line Contents
14:05:09 INFO  ===============================================
14:05:09 INFO   95   47.852 MiB    0.000 MiB   @profile(precision=3)
14:05:09 INFO   96                             def import_users(self):
14:05:09 INFO   98   49.355 MiB    1.504 MiB      make_user_importer()
14:05:09 INFO   99  365.988 MiB  316.633 MiB      read_input()
14:05:09 INFO  100  370.844 MiB    4.855 MiB      detect_users_to_delete()
14:05:09 INFO  101  370.844 MiB    0.000 MiB      delete_users(users_to_delete)
14:05:09 INFO  102  479.125 MiB  108.281 MiB      create_and_modify_users(imported_users)
14:05:09 INFO  104  479.234 MiB    0.109 MiB      log_stats()
14:05:09 INFO  112  479.926 MiB    0.691 MiB      result_exporter.dump()

real    777m15.944s
user    687m39.200s
sys     10m59.848s

432MB  / 50000 = 0.0086 MB / user
777m15 / 50000 = 0.93s / user
Comment 6 Daniel Tröder univentionstaff 2016-10-07 14:35:34 CEST
We should consider switching off sync for MDB on Jenkins during our import tests. With their low I/O it would make a big difference.

ucr set ldap/database/mdb/envflags=nosync
Comment 7 Sönke Schwardt-Krummrich univentionstaff 2016-10-07 15:22:15 CEST
(In reply to Daniel Tröder from comment #6)
> We should consider switching off sync for MDB on Jenkins during our import
> tests. With their low I/O it would make a big difference.
> 
> ucr set ldap/database/mdb/envflags=nosync

This is already the case.

ucs-4.1-3/test/ucs-ec2-tools/examples/jenkins$ grep nosync \
autotest-200-ucsschool-singleserver-s3.cfg 
 ucr set ldap/database/mdb/envflags=nosync
ucs-4.1-3/test/ucs-ec2-tools/examples/jenkins$
Comment 8 Daniel Tröder univentionstaff 2017-02-08 17:59:57 CET
r76568:
* lower memory footprint of import:
  - avoid collecting ImportUser objects (collects dict representations
    for A and M and only rUID+sUID for D); see the sketch below
  - release created objects as early as possible
  - fix class attribute usage
* API change: UcsSchoolImportError.count -> UcsSchoolImportError.entry_count
* log import progress
* sort password and summary CSV output by input order
* adapt password and summary CSV output code to the memory-footprint changes
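
To illustrate the first item (reading rUID/sUID as record_uid/source_uid; the structure is a hypothetical sketch, not the actual r76568 code):

import collections

# Small representations only: dicts for added/modified (A/M) users,
# bare (record_uid, source_uid) pairs for deleted (D) users.
DeletedUser = collections.namedtuple("DeletedUser", ["record_uid", "source_uid"])

added_or_modified = []   # list of small dicts for the summary/password CSV exports
deleted = []             # list of DeletedUser tuples, nothing more

def note_added_or_modified(import_user):
    added_or_modified.append(import_user.to_dict())   # not the ImportUser itself

def note_deleted(import_user):
    deleted.append(DeletedUser(import_user.record_uid, import_user.source_uid))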

Package: ucs-school-import
Version: 14.0.16-42.333.201702081756
Branch: ucs_4.1-0
Scope: ucs-school-4.1r2
Comment 9 Sönke Schwardt-Krummrich univentionstaff 2017-04-05 22:56:49 CEST
OK: code change
OK: functional test
OK: jenkins test (via U@S 4.2)
Comment 10 Sönke Schwardt-Krummrich univentionstaff 2017-04-10 11:50:24 CEST
UCS@school 4.1 R2 v11 has been released.

http://docs.software-univention.de/changelog-ucsschool-4.1R2v11-de.html

If this error occurs again, please clone this bug.