recover file after crash

recover file after crash

drain
Instead of Emacs telling me that I can recover auto-save data after a
crash, then leaving me to manually enter the command, I'd prefer Emacs to
prompt me with a yes-or-n-p to do so itself.
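Something like that can already be wired up by hand. The following is only a sketch, not anything built into Emacs: `my/maybe-recover` is a hypothetical helper that, on visiting a file, offers to run the standard `recover-this-file` command when newer auto-save data exists (note that `recover-this-file` will still ask its own confirmation question):

```elisp
;; Sketch only -- `my/maybe-recover' is a hypothetical helper.
(defun my/maybe-recover ()
  "Offer to run `recover-this-file' when newer auto-save data exists."
  (when (and buffer-file-name
             (file-newer-than-file-p (make-auto-save-file-name)
                                     buffer-file-name)
             (y-or-n-p "Auto-save data is newer; recover it? "))
    (recover-this-file)))

(add-hook 'find-file-hook #'my/maybe-recover)
```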

RE: recover file after crash

Drew Adams
> Instead of Emacs telling me that I can recover auto-save data after a
> crash, then leaving me to manually enter the command, I'd
> prefer Emacs to prompt me with a yes-or-n-p to do so itself.

Please use `M-x report-emacs-bug' if you want to submit an enhancement request.



RE: recover file after crash

drain
I meant to ask whether that was already an option.

Related: how can I get Emacs NOT to prompt me when I want to open large (>10 MB) files?

Re: recover file after crash

Valentin Baciu-2
Hi drain,

Maybe this will help, even though I never tried it myself because I like opening large and hairy files :)

Type the following: C-h v large-<TAB>

Then the description for the variable will show up like this:

large-file-warning-threshold is a variable defined in `files.el'.
Its value is 10000000

Documentation:
Maximum size of file above which a confirmation is requested.
When nil, never request confirmation.

You can customize this variable.

This variable was introduced, or its default value was changed, in
version 22.1 of Emacs.
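Based on that docstring, silencing the prompt is a one-liner in your init file; either disable it entirely or just raise the threshold:

```elisp
;; Never ask before visiting a large file:
(setq large-file-warning-threshold nil)

;; Or only ask above 100 MB:
;; (setq large-file-warning-threshold (* 100 1000 1000))
```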




On Fri, Jan 25, 2013 at 8:23 PM, drain <[hidden email]> wrote:
I meant to ask if that were already an option.

Related: how can I get Emacs NOT to prompt me when I want to open large (>10
MB) files?



--
View this message in context: http://emacs.1067599.n5.nabble.com/recover-file-after-crash-tp276397p276478.html
Sent from the Emacs - Help mailing list archive at Nabble.com.



Re: recover file after crash

drain
Just customized that variable to nil.

Do you guys experience lag when you open these large files? For me, the
deeper I get into a nested heading, the less lag there is, eventually
lag-free. But when I am at the overview, it starts to get pretty slow
around 10 MB.

Re: recover file after crash

Eli Zaretskii
> Date: Fri, 25 Jan 2013 11:12:43 -0800 (PST)
> From: drain <[hidden email]>
>
> Do you guys experience lag when you open these large files? For me, the
> deeper I get into a nested heading, the less lag there is, eventually
> lag-free. But when I am at the overview, it starts to get pretty slow
> around 10 MB.

Because what is one line on display may be many lines of text in the
buffer, and redisplay needs to traverse them all.


Re: recover file after crash

Valentin Baciu-2
In reply to this post by drain
Hello again,

In my experience, large files with few line ends (like minified JavaScript files) tend to lag more. Sometimes I just use vim for these kinds of files or, depending on what I want to do, grep or sed, in order to avoid opening them in Emacs. But that should be a rare case rather than the rule.

I like to think of Emacs as a code editor (mostly) which works great for development, but not necessarily for debugging.

Byte-compiling your major modes may speed up processing, but I don't think of it as a general performance solution. This "tip" applies only to custom installs, because the default/built-in ones are already compiled.

If you can share the file that you are having trouble with, I will gladly try it on my system.



On Fri, Jan 25, 2013 at 9:12 PM, drain <[hidden email]> wrote:
Just customized that variable to nil.

Do you guys experience lag when you open these large files? For me, the
deeper I get into a nested heading, the less lag there is, eventually
lag-free. But when I am at the overview, it starts to get pretty slow
around 10 MB.






Re: recover file after crash

drain
In reply to this post by Eli Zaretskii
Eli Zaretskii wrote
Because what is one line on display may be many lines of text in the
buffer, and redisplay needs to traverse them all.
Makes sense. I'm glad to know there is a clear technical reason for it, and
that it is not symptomatic of a poorly optimized installation.

In any case, I sent Valentin a large file that tends to lag (didn't want to
impose a large attachment on the mailing list).

Re: recover file after crash

Eli Zaretskii
> Date: Fri, 25 Jan 2013 12:31:57 -0800 (PST)
> From: drain <[hidden email]>
>
> In any case, I sent Valentin a large file that tends to lag

If it has no or few empty lines, adding empty lines (e.g., before
top-level headings) might dramatically improve performance in very
large buffers.
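Eli's suggestion can be automated. The sketch below is a hypothetical helper (not a built-in command, assuming Org-style `* ` top-level headings) that inserts a blank line before each top-level heading that lacks one:

```elisp
;; Hypothetical helper, per Eli's suggestion: pad each top-level
;; Org heading with a preceding blank line.
(defun my/pad-top-level-headings ()
  "Ensure a blank line before every top-level heading in the buffer."
  (interactive)
  (save-excursion
    (goto-char (point-min))
    (while (re-search-forward "^\\* " nil t)
      (beginning-of-line)
      ;; Insert a newline unless at buffer start or already preceded
      ;; by a blank line.
      (unless (or (bobp)
                  (save-excursion (forward-line -1)
                                  (looking-at "^[ \t]*$")))
        (insert "\n"))
      (end-of-line))))
```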


RE: recover file after crash

Ludwig, Mark-2
In reply to this post by drain
> From:  drain
> Sent: Friday, January 25, 2013 2:32 PM
> To: [hidden email]
> Subject: Re: recover file after crash
>
> Eli Zaretskii wrote
> > Because what is one line on display may be many lines of text in the
> > buffer, and redisplay needs to traverse them all.
>
> Makes sense. I'm glad to know there is a clear technical reason for it, and
> that it is not symptomatic of a poorly optimized installation.
>
> In any case, I sent Valentin a large file that tends to lag (didn't want to
> impose a large attachment on the mailing list).

I think I thoroughly documented this in Bug # 9589
(http://debbugs.gnu.org/cgi/bugreport.cgi?bug=9589) for the
developers to consider, which see.  (No attachment is
required to reproduce the behavior.)  It has been merged
with bugs 3219 and 4123, both of which also document how to
show the problem without a specific attachment.

You might also be interested in some of the experimental
results I documented in bug 9589 about which commands
respond consistently (i.e., quickly with long lines) and
which are sensitive to the line length and/or buffer
position.

Good luck,
Mark



Re: recover file after crash

drain
In reply to this post by Eli Zaretskii
Eli Zaretskii wrote
> Date: Fri, 25 Jan 2013 12:31:57 -0800 (PST)
> From: drain <[hidden email]>
>
> In any case, I sent Valentin a large file that tends to lag

If it has no or few empty lines, adding empty lines (e.g., before
top-level headings) might dramatically improve performance in very
large buffers.
I have to add a couple of spaces to the very end of the text of the last
nested headline in order for the spaces to separate each of the top-level
headlines. Otherwise, when I reopen the file, these spaces are gone.

In any case, after doing so, implementing your suggestion appears to have
improved performance, especially when sifting through top-level headlines
that are partially or completely opened, and when running point rapidly up
a sequence of top-level headlines.

Re: recover file after crash

Eli Zaretskii
In reply to this post by Ludwig, Mark-2
> From: "Ludwig, Mark" <[hidden email]>
> Date: Fri, 25 Jan 2013 20:56:47 +0000
>
> I think I thoroughly documented this in Bug # 9589
> (http://debbugs.gnu.org/cgi/bugreport.cgi?bug=9589) for the
> developers to consider, which see.

Assuming the OP has such long lines, yes.  But that would be unusual
in a file that is being read in any kind of overview mode, because
those normally are for human consumption, so long lines are unlikely
to appear in them.


RE: recover file after crash

Ludwig, Mark-2
> From: Eli Zaretskii
> Sent: Saturday, January 26, 2013 4:40 AM
>
> > From: "Ludwig, Mark" <[hidden email]>
> > Date: Fri, 25 Jan 2013 20:56:47 +0000
> >
> > I think I thoroughly documented this in Bug # 9589
> > (http://debbugs.gnu.org/cgi/bugreport.cgi?bug=9589) for the
> > developers to consider, which see.
>
> Assuming the OP has such long lines, yes.  But that would be unusual
> in a file that is being read in any kind of overview mode, because
> those normally are for human consumption, so long lines are unlikely
> to appear in them.

Agreed, this is not an every-day occurrence.

In my case, it is because log files I receive from customers sometimes
have a huge number of NUL bytes preceding the readable content.  This
leading, extremely-long line gets in the way of reasonable
responsiveness.

I also confess that I have not been keeping up with the current trends
in Emacs development.  Back when I started using EMACS [sic], it was
an excellent binary editor.  Apparently this is not a current
requirement, because it clearly is no longer useful for same.  Side
question: is there a GNU tool designed for editing binary files?

Cheers,
Mark


Re: recover file after crash

Eli Zaretskii
> From: "Ludwig, Mark" <[hidden email]>
> Date: Sat, 26 Jan 2013 21:47:50 +0000
>
> Side question: is there a GNU tool designed for editing binary
> files?

Hexl mode in Emacs?
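For reference, Hexl can be entered either directly from the file or by toggling an existing buffer (both commands ship with Emacs):

```elisp
;; Open a file as a hex dump, reading it literally:
;;   M-x hexl-find-file RET somefile.bin RET
;; Or switch an already-visited buffer into hex view:
;;   M-x hexl-mode
```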


Re: recover file after crash

David Combs
In reply to this post by Eli Zaretskii
In article <[hidden email]>,
Ludwig, Mark <[hidden email]> wrote:

>> From: Eli Zaretskii
>> Sent: Saturday, January 26, 2013 4:40 AM
>>
>> > From: "Ludwig, Mark" <[hidden email]>
>> > Date: Fri, 25 Jan 2013 20:56:47 +0000
>> >
>> > I think I thoroughly documented this in Bug # 9589
>> > (http://debbugs.gnu.org/cgi/bugreport.cgi?bug=9589) for the
>> > developers to consider, which see.
>>
>> Assuming the OP has such long lines, yes.  But that would be unusual
>> in a file that is being read in any kind of overview mode, because
>> those normally are for human consumption, so long lines are unlikely
>> to appear in them.
>
>Agreed, this is not an every-day occurrence.
>
>In my case, it is because log files I receive from customers sometimes
>have a huge number of NUL bytes preceding the readable content.  This
>leading, extremely-long line gets in the way of reasonable
>responsiveness.
>
>I also confess that I have not been keeping up with the current trends
>in Emacs development.  Back when I started using EMACS [sic], it was
>an excellent binary editor.  Apparently this is not a current
>requirement, because it clearly is no longer useful for same.  Side
>question: is there a GNU tool designed for editing binary files?
>
>Cheers,
>Mark
>

About all those NULs that make lines so long: couldn't you first run the
file through sed, a super-short C program, or even Perl, and delete
(i.e., ignore) all those NULs?

David
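If you'd rather stay inside Emacs, the same cleanup can be sketched in Lisp. `my/strip-nuls` is a hypothetical helper, not a built-in; visiting the file with `find-file-literally` first avoids any decoding surprises:

```elisp
(defun my/strip-nuls ()
  "Delete every NUL byte in the current buffer."
  (interactive)
  (save-excursion
    (goto-char (point-min))
    ;; "\0" is a one-character string containing the NUL byte.
    (while (search-forward "\0" nil t)
      (replace-match "" nil t))))
```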