SVN Timing Out on Initial Commit

    I've been trying to get SVN working on a Raspberry Pi 3B+ w/ a NAS attached. I've got it mostly working (I'm able to commit small projects/tests). However, I'm trying to commit a project that's roughly 2.4GB in size (it takes roughly 2.5 hours to run the import).

    On the initial commit/import, Tortoise successfully goes through all of the files, then says "Committing transaction..." for roughly 10 minutes, and then ends with errors. I've checked permissions, deleted the repository on the NAS and re-created the project (with permissions) several times with slightly different settings, and edited several timeout settings (as seen below). I'm pretty baffled as to why this keeps failing. (FYI - most of my configuration edits come from other posts about similar, but not identical, issues I found on the web.)

    Below are the errors, configurations, and system information on the matter. Could someone help me figure this out? Thanks in advance for your assistance.

    Tortoise SVN Commit error:

    [CODE]Connection timed out
    Additional errors:
    Unexpected server error 500 'Internal Server Error' on '/svn/LinkedBound/!svn/txn/0-0'[/CODE]

    Apache2 error.log:

    [CODE][Sun Jul 12 12:38:18.450162 2020] [dav:error] [pid 901:tid 1905226784] [client] Could not DELETE /svn/LinkedBound/!svn/txn/0-0. [500, #0]
    [Sun Jul 12 12:38:18.454529 2020] [dav:error] [pid 901:tid 1905226784] [client] could not abort transaction. [500, #2]
    [Sun Jul 12 12:38:18.454591 2020] [dav:error] [pid 901:tid 1905226784] [client] Transaction '0-0' cleanup failed [500, #2]
    [Sun Jul 12 12:38:18.454629 2020] [dav:error] [pid 901:tid 1905226784] [client] Can't remove '/home/pi/myNAS/Projects/GameDev/repos/Unity/LinkedBound/db/transactions/0-0.txn/node._7f.0' [500, #2]
    [Sun Jul 12 12:38:18.454666 2020] [dav:error] [pid 901:tid 1905226784] [client] Can't remove file '/home/pi/myNAS/Projects/GameDev/repos/Unity/LinkedBound/db/transactions/0-0.txn/node._7f.0': No such file or directory [500, #2]
    [Sun Jul 12 12:38:18.612336 2020] [dav:error] [pid 903:tid 1894757408] [client] Could not MERGE resource "/svn/LinkedBound/!svn/txn/0-0" into "/svn/LinkedBound". [500, #0]
    [Sun Jul 12 12:38:18.612503 2020] [dav:error] [pid 903:tid 1894757408] [client] An error occurred while committing the transaction. [500, #160014]
    [Sun Jul 12 12:38:18.612553 2020] [dav:error] [pid 903:tid 1894757408] [client] Reference to non-existent node '_fhr.0.t0-0' in filesystem '/home/pi/myNAS/Projects/GameDev/repos/Unity/LinkedBound/db' [500, #160014][/CODE]

    Tortoise SVN Config (%appdata%\subversion\config) alterations:

    [CODE]http-timeout = 3600000[/CODE]

    ~/.subversion/servers alterations:

    [CODE]http-timeout = 3600000[/CODE]
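
    One thing worth double-checking (an assumption about the setup here, not a confirmed cause): in the servers file, http-timeout only takes effect under a group header such as [global], and the option is measured in seconds, so 3600000 is roughly 1000 hours. A minimal sketch of the relevant section:

    [CODE][global]
    # one hour; this option is in seconds, not milliseconds
    http-timeout = 3600[/CODE]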

    Permissions set:

    [CODE]sudo chown -R www-data:www-data /home/pi/myNAS/Projects/GameDev/repos/ && sudo chmod a+x /etc/apache2[/CODE]
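
    The Apache log above is a string of failed deletes under db/transactions, so it may help to verify that files there can actually be created and removed. A sketch, which defaults to a throwaway directory; point REPO at the real repository (and run the touch/rm via sudo -u www-data) to test what Apache can actually do:

```shell
# Verify that files in <repo>/db/transactions can be created and deleted.
# REPO defaults to a scratch directory for illustration.
REPO="${REPO:-$(mktemp -d)/LinkedBound}"
mkdir -p "$REPO/db/transactions"

if touch "$REPO/db/transactions/.writetest" 2>/dev/null \
   && rm "$REPO/db/transactions/.writetest" 2>/dev/null; then
    echo "create/delete OK in $REPO/db/transactions"
else
    echo "create/delete FAILED in $REPO/db/transactions"
fi
```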

    dav_svn.conf settings:

    [CODE]KeepAlive On
    MaxKeepAliveRequests 0
    # Set to 10h.
    Timeout 36000
    SVNCompressionLevel 5
    SVNInMemoryCacheSize 16384
    SVNCacheTextDeltas On
    SVNCacheFullTexts On
    SVNAllowBulkUpdates Prefer
    <Location /svn>
    DAV svn
    SVNParentPath /home/pi/myNAS/Projects/GameDev/repos/Unity
    SVNListParentPath On
    AuthType Basic
    AuthName "Subversion Repo"
    AuthUserFile /etc/apache2/dav_svn.passwd
    Require valid-user
    # Allow requests of unlimited size
    LimitXMLRequestBody 0
    </Location>[/CODE]

    WD My Cloud EX4100 /etc/exports setting:

    [CODE]"/nfs/Projects" *(rw,no_root_squash,sync,no_wdelay,insecure,no_subtree_check,crossmnt)[/CODE]
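
    Since the repository lives on an NFS share, FSFS is also sensitive to the client-side mount options (locking and caching), not just the export line. A sketch of a conservative /etc/fstab entry on the Pi, with a hypothetical NAS address:

    [CODE]# hard,sync and NFSv3 with working locking are the safe choices for FSFS on NFS
    192.168.1.50:/nfs/Projects  /home/pi/myNAS/Projects  nfs  rw,hard,sync,vers=3  0  0[/CODE]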

    NAS Information:

    [CODE]WD My Cloud EX4100
    Firmware: 2.31.204[/CODE]

    Tortoise SVN Version

    [CODE]1.14 (r28864)[/CODE]

    SVN Version

    [CODE]svn, version 1.10.4 (r1850624)[/CODE]

    SVN OS

    [CODE]Linux 4.19.118-v7+ #1311 SMP Mon Apr 27 14:21:24 BST 2020 armv7l
    GNU/Linux Distributor ID: Raspbian
    Description: Raspbian GNU/Linux 10 (buster)
    Release: 10
    Codename: buster[/CODE]

    My PC:

    [CODE]Microsoft Windows [Version 10.0.18363.900][/CODE]

  • #2
    The armv7l is a 32-bit processor. It's unclear how it will handle 2.4GB of data (which could exceed MAXINT).
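
    The arithmetic behind that concern: a signed 32-bit integer tops out just above 2.1GB, and 2.4GB is past it.

```shell
# Largest value representable in a signed 32-bit integer (INT_MAX)
echo $(( (1 << 31) - 1 ))         # 2147483647
# Approximate size of a 2.4GB project, in bytes
echo $(( 2400 * 1024 * 1024 ))    # 2516582400 -- beyond INT_MAX
```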

    Do small commits work? Is it only this really big one failing?


    • #3
      First, let me just say [USER="125501"]DougR[/USER] THANK YOU for responding... I've spent the last two weeks trying, in other places, to get someone to suggest something... anything! You're the first to respond.

      To answer your question, small commits do work, it is only the big one failing.

      That's an interesting thought that it could be a processor issue... Is there a way to force SVN to do things in smaller chunks so the processor (and I'd imagine the 1GB of RAM) might not be overwhelmed?

      Or should I look into setting up a different way? I was really hoping to get the Raspberry Pi 3B+ to work for this...


      • #4
        Just to validate my assumption I'd suggest the following:
        - Create a scratch repository
        - Check it out
        - Create a 1024-byte (1KiB) file. Add it, update.
        - Create a 1024*1024-byte (1MiB) file. ...
        - Create a 1024*1024*1024-byte (1GiB) file. ...
        - Create a 2 GiB file. ...
        - Create a 3 GiB file. ...

        If all goes well until the 2 GiB (or 3 GiB) file, then something in your Apache/SVN build is using MAXINT where it should be using a 64-bit type (long long). Tracking down stuff like this is very time-consuming; fixing it might be more so (transitive closure...).
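
        The size ladder above can be generated in seconds using sparse files (dd with count=0 and seek allocates no real data); the svn steps are left as comments since they depend on the working copy and repository URL:

```shell
# Create test files of increasing size as sparse files (instant, no real I/O).
# WC defaults to a scratch directory; use your checked-out working copy instead.
WC="${WC:-$(mktemp -d)}"
for size in 1K 1M 1G 2G 3G; do
    dd if=/dev/zero of="$WC/test_$size.bin" bs=1 count=0 seek="$size" 2>/dev/null
done
ls -l "$WC"
# Then, inside the real working copy, after each file:
#   svn add test_1K.bin && svn commit -m "1 KiB test"
# ...to find the size at which the commit breaks.
```

        Note that sparse files read back as all zeros, which compresses extremely well; filling them from /dev/urandom instead (much slower) is closer to real project data.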

        I've never used a Raspberry Pi, so I'm not sure what other alternatives you've got...


        • #5
          [USER="125501"]DougR[/USER] Thanks much for your assistance.

          Running something close to what you suggested, I tested commits of repos at various sizes (50K, 1.71MB, 88.7MB, and 150MB)...

          These tests showed that the commit failed once I got up to 150MB, and that only roughly 50MB went through before the failure.

          I ended up just scrapping SVN and going with Git on the RPi 3B+; it did what I needed right away.
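
          For anyone landing here with the same hardware, the replacement setup is simple: a bare Git repository on the Pi that the Windows machine pushes to over SSH. A sketch with illustrative paths:

```shell
# On the Pi: create a bare repository to push to
# (a scratch path here; the real one would live under the NAS mount)
GITDIR="${GITDIR:-$(mktemp -d)/LinkedBound.git}"
git init --bare "$GITDIR"

# On the Windows PC, inside the project directory:
#   git init
#   git add -A && git commit -m "Initial import"
#   git remote add pi pi@raspberrypi:/path/to/LinkedBound.git
#   git push pi master
```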