Reduce memory usage when searching for commits and issues #159
Open
marcuscaisey wants to merge 1 commit into gharlan:main from
Conversation
force-pushed 49fd9d9 to 68f0e61
force-pushed 68f0e61 to 6e4e956
Problem
When searching for commits or issues in a large repository, the `gh` script filter can exhaust all of its available memory and crash. For example, the query `gh neovim/neovim *72cf89bce8` (specific commit chosen because it's the initial one) results in the following output in the debugger:

I wrote a small script which performs the above query:
and profiled it using https://github.com/arnaud-lb/php-memory-profiler:
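For reference, capturing such a profile typically looks like the following (a sketch; consult the memprof extension's README for the exact API, as the helper names here are assumptions):

```php
<?php
// Sketch of dumping a memprof profile (assumption: the memprof extension is
// loaded and profiling was enabled, e.g. via MEMPROF_PROFILE=1 in the
// environment). The guard makes this safe to run without the extension.
if (function_exists('memprof_enabled') && memprof_enabled()) {
    // ... run the expensive query here ...
    // Dump a callgrind-format profile that qcachegrind/kcachegrind can open.
    memprof_dump_callgrind(fopen('profile.callgrind', 'w'));
} else {
    echo "memprof not active; skipping profile dump\n";
}
```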
The resulting profile surfaced `json_decode` as the biggest offender:

The issue is that in `Workflow::requestCache`, we store all of the responses from the API in an array:
alfred-github-workflow/workflow.php, line 238 in bb79902
So for a large repository with lots of commits, the size of this array can outgrow the default memory limit of 128 MB.
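As a minimal sketch of the pattern (a hypothetical simplification, not the actual workflow.php code), the accumulation looks roughly like this:

```php
<?php
// Hypothetical simplification of the pagination loop in Workflow::requestCache
// (not the actual workflow.php code). Every decoded page is appended to
// $responses and kept alive for the duration of the search, so memory grows
// linearly with the number of commits fetched.

// Stand-in for one page of the GitHub commits API.
function fetchPage(int $page): string
{
    $items = [];
    for ($i = 0; $i < 100; $i++) {
        $items[] = [
            'sha'    => str_repeat('a', 40),
            'commit' => ['message' => str_repeat('x', 500)],
        ];
    }
    return json_encode($items);
}

$responses = [];
for ($page = 1; $page <= 50; $page++) {
    // The full decoded page is stored and never released.
    $responses[] = json_decode(fetchPage($page));
}
printf("peak memory with full responses: %.1f MB\n", memory_get_peak_usage() / 1048576);
```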
Solution
This PR adds an optional `$transformItem` parameter to `Workflow::requestCache` and `Workflow::requestApi`. When provided, `$transformItem` is called to transform each item returned from the API into another form. The commit and issue searches now transform each item with `$transformItem`. By doing so, we can throw away the large response object that we were previously storing in `$responses` and only store the data that we need.

After these changes, the
`gh` script filter no longer crashes on the input `gh neovim/neovim *72cf89bce8`.

Effectiveness
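To make the change concrete before measuring it, here is a sketch of how `$transformItem` might be used (hypothetical signatures and field names; the real call sites in the workflow differ):

```php
<?php
// Sketch of the $transformItem idea (hypothetical signatures, not the actual
// workflow API): each decoded item is reduced to the fields the filter needs
// before it is stored, so the large raw objects become garbage-collectable.
function requestCacheSketch(array $pages, ?callable $transformItem = null): array
{
    $items = [];
    foreach ($pages as $json) {
        foreach (json_decode($json) as $item) {
            // Store the transformed (small) item instead of the raw one.
            $items[] = $transformItem !== null ? $transformItem($item) : $item;
        }
        // The decoded page goes out of scope here and can be freed.
    }
    return $items;
}

// Example: keep only the short SHA and the first line of the commit message.
$pages = [json_encode([[
    'sha'    => str_repeat('a', 40),
    'commit' => ['message' => "initial commit\nmore text"],
]])];
$commits = requestCacheSketch($pages, function ($item) {
    return [
        'sha'     => substr($item->sha, 0, 10),
        'subject' => strtok($item->commit->message, "\n"),
    ];
});
print_r($commits);
```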
To understand the effectiveness of this solution, I've written a slightly modified version of the above test script which sets the memory limit to 1 GB so that it won't crash:
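The modified script itself isn't shown above; a hypothetical reconstruction of its shape would be:

```php
<?php
// Hypothetical reconstruction of the modified test script (the original is
// not included above): raise the memory limit so the run completes instead
// of crashing, making before/after peak memory usage comparable.
ini_set('memory_limit', '1G');

// Stand-in for running the gh script filter with the problematic query;
// the real script would invoke the workflow's commit search here.
$query = 'neovim/neovim *72cf89bce8';
echo "searched {$query} with memory_limit=" . ini_get('memory_limit') . "\n";
```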
and measured the memory usage using
Before
After
Conclusion
The max resident set size has decreased from 529 MB to 70 MB, a reduction of 87%.