Bug #1527: Can't access any files from a specific directory

Kind bug
Product Command-T
When 2010-03-30T10:51:08Z
Status closed
Reporter anonymous
Tags no tags

Description

I saw this Vim script and watched the screencast, and it seems like the missing piece I was always looking for. :)

But when I try to use it on my computer (Mac OS X 10.6.2, MacVim) it doesn't work.

I have a hierarchy like this:

$HOME/
# find . -type d -iname "[a-zA-Z]*" -depth 1
./Applications
./Desktop
./Documents
./Downloads
./FontExplorer X
./Library
./Movies
./Music
./Pictures
./Public
./Source
./bin
./binaries
./compile
./glibc
./gtk
./macports_local
./metatrader
./pkg

With approximately this much in it:

# ls -alR | wc -l
  489896
# ls compile 
HandBrake       aiccu  gnupg-1.4.10  tuntaposx
ffmpeg-mt       gpac  
.....

and, for example, there is this file: compile/ffmpeg-mt/ffplay.c

I load MacVim and if I issue :pwd I get /Volumes/dat/myusername

then :CommandT

and in the prompt I type cfffpl, but nothing comes up from the compile dir, so I try compffmp and again nothing shows up. I tried many, many other patterns, but even compile won't list the directories inside "compile".

Is that a bug, or am I doing something wrong?

Comments

  1. Greg Hurrell 2010-03-30T10:58:51Z

    Take a look at the documentation and in particular the g:CommandTMaxFiles setting.

    The default is to stop scanning after 10,000 files, so in a case like the directory you're talking about, only a tiny percentage of it actually gets scanned. The idea is to set the limit to something useful for the largest project you usually work on; by keeping the default low (10,000) we prevent a user from inadvertently making his or her hard drive churn for 15 or 30 seconds because he/she accidentally hit <Leader>t when the :pwd was / or something similar.

    So depending on how beefy your machine is, you can change that setting by putting something like this in your ~/.vimrc:

    let g:CommandTMaxFiles=100000

    Or even this:

    let g:CommandTMaxFiles=500000

    That might fix the issue.

    The other thing to bear in mind is that, given that Command-T looks for matching characters anywhere within the path name, when you are scanning 500,000 files a search string like "cfffpl" is probably going to match hundreds and hundreds of files, so in order to narrow down the number of matches as quickly as possible you'll want to supply more letters from the leading path components.
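
    As a rough illustration, here is a minimal Ruby sketch of "characters anywhere in the path, in order" matching. It is not Command-T's actual matching or scoring code, just the gist of why a short query like "cfffpl" can match so many paths:

        # Returns true if the characters of `query` appear in `path`
        # in order, though not necessarily adjacently.
        def subsequence_match? query, path
          i = 0
          path.each_char do |c|
            i += 1 if i < query.length && c == query[i]
          end
          i == query.length
        end

        subsequence_match? 'cfffpl', 'compile/ffmpeg-mt/ffplay.c'  # => true
        subsequence_match? 'cfffpl', 'compile/ffmpeg-mt/Makefile'  # => false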

  2. anonymous 2010-03-30T11:37:31Z

    Ah, I should have read the documentation, sorry.

    But that doesn't solve my issue. After trying more sane values, I tried a really high one (99 999 999 999), which gets bound to 1 215 752 191; I think that should be large enough.
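
    That bound is consistent with 32-bit truncation; assuming Vim stores the option as a 32-bit integer, a quick Ruby check shows where the number comes from:

        # 99_999_999_999 does not fit in 32 bits, so it wraps modulo 2**32:
        99_999_999_999 % 2**32  # => 1215752191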

    # time find . -type f | wc -l                   (~)
      261169
    find . -type f  0,79s user 11,77s system 17% cpu 1:12,78 total
    wc -l  0,04s user 0,03s system 0% cpu 1:12,77 total
    # find . -type f -depth 15 |wc -l               (~)
       18293

    then:

    :CommandTFlush
    :CommandT
    >>> compileffmp

    The directory and the file still didn't show up. :(

    But when I do:

    :lcd ~/compile
    :CommandT
    >>> ffmffpl

    I can get to my file.

    Also, after changing the max files setting, :CommandT takes a little bit longer, but at most 10 seconds.

    Is it possible to 'dump' the searched/stored files for debugging purposes? Or am I again missing an option?

  3. Greg Hurrell 2010-03-30T12:21:04Z

    The delay of about 10 seconds is normal when scanning such a large directory tree (as you can see from how long it takes find to scan it), but it should only be slow the first time you bring up Command-T. The second time you bring it up it should just use the already-scanned list of files rather than rescanning.
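
    A minimal sketch of that scan-once-then-cache pattern, with illustrative names rather than Command-T's actual internals:

        class Scanner
          def paths
            # the expensive recursive scan runs only on the first call;
            # subsequent calls reuse the memoized list
            @paths ||= scan_filesystem
          end

          # roughly what :CommandTFlush does: drop the cache so the
          # next invocation rescans
          def flush
            @paths = nil
          end

          private

          def scan_filesystem
            Dir.glob('**/*')  # stand-in for the real recursive scan
          end
        end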

    Perhaps the number of matches is simply too large for them all to fit in the match window? I.e. you type "compileffmp" and get a bunch of results filling up the entire window, but not the one you want?

    As for being able to "dump" the scanned files, there is no built-in option for this, but you could insert one by editing the file ~/.vim/ruby/command-t/scanner.rb. It will probably slow things down quite a lot, though; e.g. around line 78, in the add_paths_for_directory method, you could add a puts to dump the paths as they are scanned:

        def add_paths_for_directory dir, accumulator
          Dir.foreach(dir) do |entry|
            next if ['.', '..'].include?(entry)
            path = File.join(dir, entry)
            unless path_excluded?(entry)
              if File.file?(path)
                @files += 1
                raise FileLimitExceeded if @files > @max_files
                accumulator << path[@prefix_len + 1..-1]
                puts path[@prefix_len + 1..-1] # <<------ add this line to print out paths during scanning
              elsif File.directory?(path)
                next if (entry.match(/\A\./) && !@scan_dot_directories)
                @depth += 1
                raise DepthLimitExceeded if @depth > @max_depth
                add_paths_for_directory path, accumulator
                @depth -= 1
              end
            end
          end
        rescue Errno::EACCES
          # skip over directories for which we don't have access
        end

    As you can see from that method, there are a few ways in which a path won't get included:

    • it is skipped if path_excluded? returns true (i.e. if your Vim 'wildignore' setting contains a pattern which matches the file or directory; see the sketch after this list)
    • it is skipped if it is neither a file nor a directory (symbolic links to files or directories seem to be fine)
    • it is skipped once the total number of scanned files exceeds the limit you set via g:CommandTMaxFiles
    • it is skipped if you descend too many levels (in excess of the limit defined by g:CommandTMaxDepth)
    • it is skipped if you don't have read permission
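
    For the first point, here is a rough sketch of wildignore-style exclusion, assuming the patterns have already been split out of Vim's 'wildignore' option; it is illustrative, not Command-T's exact implementation:

        # example patterns, as they might be split out of 'wildignore'
        WILDIGNORE_PATTERNS = ['*.o', '*.obj', '.git']

        def path_excluded? entry
          WILDIGNORE_PATTERNS.any? { |pattern| File.fnmatch pattern, entry }
        end

        path_excluded? 'ffplay.o'  # => true
        path_excluded? 'ffplay.c'  # => false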

    This discussion has, however, made me notice one bug in the scanning method: it looks like if the depth limit gets hit early in the scan, the entire scan gets aborted instead of just the recursion into the subdirectory currently being scanned. So I'll fix that. Not sure if it might be responsible for your missing-file problem.
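
    One way the fix could look, relative to the method quoted above, is to guard the recursion instead of raising DepthLimitExceeded (which unwinds out of every level of the recursion and aborts the whole scan); the actual committed fix may differ:

        def add_paths_for_directory dir, accumulator
          Dir.foreach(dir) do |entry|
            next if ['.', '..'].include?(entry)
            path = File.join(dir, entry)
            unless path_excluded?(entry)
              if File.file?(path)
                @files += 1
                raise FileLimitExceeded if @files > @max_files
                accumulator << path[@prefix_len + 1..-1]
              elsif File.directory?(path)
                next if (entry.match(/\A\./) && !@scan_dot_directories)
                # skip just this subtree instead of raising, so the scan
                # continues at shallower depths
                next if @depth >= @max_depth
                @depth += 1
                add_paths_for_directory path, accumulator
                @depth -= 1
              end
            end
          end
        rescue Errno::EACCES
          # skip over directories for which we don't have access
        end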

  4. Greg Hurrell 2010-03-30T12:31:44Z

    Just pushed the fix for the bug mentioned in my last comment.

    Like I said, not sure if it might fix your problem, but in any case it was a bug that needed to be fixed.

  5. anonymous 2010-03-30T16:54:04Z

    In fact, for me the delay was too short compared to the time taken by find, so I was wondering if Command-T was so fast that it could outperform the old 'find'. ;)

    The number of matches was really 'short': a dozen lines at most.

    But the bug you discovered solved my problem. :) Thanks for your time and for Command-T.

    And as you suggested, searching through 10K+ files is a bit slow, but that's not a problem; I know there are limits and I'm okay with that. Now the first call to :CommandT takes a little bit longer. But I should point out that my homedir is a real mess, to say the least. ;)

  6. Greg Hurrell 2010-03-30T16:57:16Z

    Cool. Glad to hear that that solved the problem.

    As for the speed, I must admit that the biggest project I use it on is somewhat over 7,000 files and the performance is great. If it turns out that more people start to use it and start demanding good performance on even larger projects there is still plenty of scope for optimization in the project. I just took the "low hanging fruit" and stopped once performance for my use case became satisfactory (and the truth is, I am very very satisfied with it on that size project).

    Will mark this one as closed for now.

  7. Greg Hurrell 2010-03-30T16:57:22Z

    Status changed:

    • From: new
    • To: closed

Comments are now closed for this issue.
