Memory leak in development · Issue #27273 · rails/rails · GitHub
Memory leak in development #27273

Closed
printercu opened this issue Dec 5, 2016 · 17 comments
Comments

@printercu
Contributor
printercu commented Dec 5, 2016

Steps to reproduce

Add these lines to ApplicationController:

  after_action do
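    # Log how many entries the first view-path resolver's internal
    # cache (the @data hash inside @cache) currently holds.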
    logger.info 'Leaked: ' +
      view_paths.first.instance_variable_get(:@cache).instance_variable_get(:@data).size.to_s
  end

Refresh the page a few times and watch the number increase.

It's related to #20384. Digests are cleared, but the cache objects are not. I haven't found an easy way to clear all the caches, as every controller seems to have its own.

Expected behavior

Cache size should not grow indefinitely.

Actual behavior

It grows indefinitely.

System configuration

Rails version: 5.0.0.1

Ruby version: 2.3

@rafaelfranca
Member

r? @kaspth

@kaspth
Contributor
kaspth commented Dec 6, 2016

@printercu hey, thanks for the issue!

I'm curious how you see this being related to #20384, since, as you say, the digests are cleared correctly?

Is it possible for you to try git bisecting and finding the initial commit that contained this bug?

@printercu
Contributor Author

@kaspth following #27271 (comment)

Digests are cleared, but not the cache stores in Resolver#cache (https://github.com/rails/rails/blob/master/actionview/lib/action_view/template/resolver.rb#L63).
The key arg there is the digest. Since it is regenerated for every template on every request, it always misses the cache, yet @data is never cleared.

The main point is that there is an individual cache for every PathSet, and I don't see a simple way to make the cache per-request.
Here are the ways I see (a rough sketch of the second option follows the list):

  • use a self-clearing LRU object for the cache in development.
  • track all cache instances and clear them all together.
  • use hooks, which is very similar to the previous option.
  • involve the request object to keep the cache within it (I don't like this idea).
  • ???
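
For illustration, a minimal sketch of the second option, assuming a hypothetical registry (CacheRegistry, register and clear_all are invented names, not existing Rails API):

  module CacheRegistry
    @caches = []
    @mutex  = Mutex.new

    class << self
      # Remember every cache store so they can all be flushed together.
      def register(cache)
        @mutex.synchronize { @caches << cache }
        cache
      end

      # Development-only hook: wipe every registered cache at once.
      def clear_all
        @mutex.synchronize { @caches.each(&:clear) }
      end
    end
  end

  # Resolver#initialize (pseudo):      @cache = CacheRegistry.register({})
  # Development initializer (pseudo):  ActiveSupport::Reloader.to_complete { CacheRegistry.clear_all }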

@matthewd
Member
matthewd commented Dec 6, 2016

The key arg there is the digest. As it refreshes for every template on every request, it misses the cache.

Do we expect that? 😕

Per #27271 etc, we intend to flush the cached digests on each request. But if the file(s) haven't changed, shouldn't the new digest match the old one? Maybe I'm just confused again.


For both of these caches, having them global across threads, yet cleared per request/execution, makes me feel weird. Perhaps the "right" thing to do would be for the executor layer to supply a thread-local cache store that all the PathSet instances would use? (Or still a global one when the cache is permanent, of course.)
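
A rough standalone sketch of that idea, with made-up names (ExecutionCache is not a real Rails class; the point is just a thread-local store the executor could reset around each request):

  class ExecutionCache
    KEY = :view_resolver_cache

    # One cache per thread, created lazily.
    def self.current
      Thread.current[KEY] ||= new
    end

    # What the executor would call at the end of each execution in development.
    def self.reset!
      Thread.current[KEY] = nil
    end

    def initialize
      @data = {}
    end

    # Return the cached value for +key+, computing and storing it on a miss.
    def fetch(key)
      @data.fetch(key) { @data[key] = yield }
    end
  end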

@printercu
Contributor Author
printercu commented Dec 7, 2016

I agree with you. I don't think we should recompile templates unless they have changed.

Setting aside the digests, their caches and the per-request expiries, what must the app actually do?

  1. Provide content caching (the cache helper).
  2. Expire the content cache when a template or any of its dependencies is updated.
  3. Expire the content cache in development (Why? Isn't the previous point enough?).
  4. Don't recompile templates when cache_template_loading is true.
  5. Recompile templates when cache_template_loading is false AND the template file is updated.
  6. A template must be compiled for each different set of details (locale, locals, variants, ...).

Please adjust the list, if it's missing something.

According to this list, 4 & 5 are handled via the template file's mtime. At the moment PerExecutionDigestCacheExpiry clears a cache that is based on template details (which are not digests themselves). That data is not related to caching, and I think it should remain untouched. Maybe it was different at some point in the past.

Content-cache expiration is handled by hashing the template content; I think the app should expire this cache instead, or just expire the whole template cache on every request. I think this can be done much more simply.
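
To make the distinction concrete, here is a toy model (nothing below is real Rails code; ToyTemplate is an invented name): the mtime answers "does this need recompiling?" (points 4 & 5 above), while the content digest answers "is a cached fragment still valid?" (points 1-3):

  require 'digest'

  ToyTemplate = Struct.new(:path, :compiled_at) do
    # Recompile only when the file on disk is newer than the compiled version.
    def needs_recompile?
      File.mtime(path) > compiled_at
    end

    # Key fragment caches by the template's content, so edits invalidate them.
    def fragment_cache_key
      Digest::MD5.file(path).hexdigest
    end
  end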

These are not my final thoughts, but I don't have time to dig deeper right now. This is where I've got to so far.

Here are some possibly related commits:

  • 56ac6e4 - Is ThreadSafe::Cache the same as Concurrent::Map?

I've committed some of my changes to #27292. It should be easier to analyze the code there.

@printercu
Contributor Author

@kaspth you closed #27292 three minutes after I opened it. I don't think that was the right decision; you didn't even have time to read it. But I had put a lot of time into understanding the smelly code and preparing these commits.

@kaspth
Contributor
kaspth commented Dec 7, 2016

@printercu it doesn't take that long to see it was a pull request that did a lot of things we, on the Rails team, don't want. Here's a more elaborate explanation on why we don't accept cosmetic changes: #13771 (comment).

Appreciate the effort, but we need a PR that's more tuned to the bug at hand.

@printercu
Contributor Author

Uh. Not all Ruby modules are intended for inclusion. I don't think there are many use cases for include Rails or include GC. Modules are for scoping/encapsulation; classes are for creating instances.

What about eliminating dead code, is that also a cosmetic change? What about moving an unrelated constant into the scope it belongs to? I think you just haven't made the effort to read the commits carefully.

@printercu
Contributor Author
printercu commented Dec 7, 2016

One more commit: #25041
After this one, the keys for the templates cache are Concurrent::Maps, which contain all the digests for the requests, and they are never GC'd, together with all their contents. And I don't think the hash method on them is as fast as getting object_id.

@printercu
Contributor Author
printercu commented Dec 7, 2016

Here are the lines to understand:

  1. @details_keys[details] ||= Concurrent::Map.new
  2. [details, DetailsKey.get(details)]
  3. def cache(key, name, prefix, partial, locals)

DetailsKey.get(details) returns the Map from (1). It is called from (2) and passed as the key to (3).
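
A toy illustration of why that leaks (made-up code, not Rails): when the cache key is a brand-new object on every request, every lookup misses, and the cache keeps both the old keys and their values alive:

  data = {}

  3.times do |request|
    key = Object.new                 # stands in for a per-request DetailsKey / digest
    data[key] ||= "compiled templates for request #{request}"
  end

  puts data.size                     # => 3, one entry per "request", never released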

@printercu
Contributor Author

Here is another one, which coupled the digest cache with the details key: #23756

What if we revert 13c4cc3 and return object_id as the hash? Then there would be no collisions, and the digest cache could be moved back to Digestor.
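
A sketch of what that could look like on a details-key-like object (illustrative only, not the actual patch):

  class IdentityKey
    # Identity-based hashing: two keys are equal only if they are the very
    # same object, so distinct instances can never collide.
    def hash
      object_id
    end

    def eql?(other)
      equal?(other)
    end
  end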

/ping @tenderlove

@tenderlove
Member

I think long term we need to make these caches related. I don't think it makes sense to clear one cache and not the other. For now, I think we can just clear both caches at the same time (in development).

@printercu
Contributor Author

Which caches do you mean? I think there is no need to clear the DetailsKey cache at all: it just maps a set of format, locale and other details to some fixed value.

@printercu
Contributor Author

@rafaelfranca here is #27296 fixing this issue. I've just resolved the conflicts there; I'll rebase it onto master if everything is fine.

@duyleekun

Hi @printercu

I forgot to enable cache_template_loading in production, and tons of ActionView::Template objects leaked in memory (heap dump attached).

Is this intended behavior, or should I file a separate issue? Mine might be a dup of this one.

2018-10-17T17:30:54+07:00-heap.dump.zip

@printercu
Contributor Author

Yes, I think this is the same issue. #27296 still hasn't been merged.

@byroot
Member
byroot commented Dec 15, 2024

I'm not entirely sure I understand where the original issue came from, but this variable no longer exists, and looking at an app in development mode, ObjectSpace.each_object(ActionView::Template).count doesn't grow.

So I'm going to assume this was fixed at some point.
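
For anyone re-checking this in their own app, something along these lines in a development console (after hitting a few pages) should do:

  # A count that keeps growing across requests would suggest the leak is still there.
  GC.start
  puts ObjectSpace.each_object(ActionView::Template).count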

@byroot byroot closed this as completed Dec 15, 2024