Search Results

Search found 59543 results on 2382 pages for 'solution files'.


  • Table filtering in jQuery - a more elegant solution please

    - by Neil Burton
    I want to filter certain rows out of a table and am using classes to categorise the rows. The code below enables me to show and hide row data categorised as "QUO" and "CAL" (eventually there will be other categories). Can someone point me towards a more elegant solution, so I don't have to duplicate code for each category as I have below? Thanks!

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
          <title>Untitled</title>
          <style>
          </style>
          <script src="Javascript/jquery-1.4.2.min.js" type="text/javascript"></script>
          <script type="text/javascript">
            $(document).ready(function () {
              $("#toggle_ac_cal").click(function () {
                var checked_status = this.checked;
                if (checked_status == true) {
                  $(".ac_cal").show();
                } else {
                  $(".ac_cal").hide();
                }
              });
              $("#toggle_ac_quo").click(function () {
                var checked_status = this.checked;
                if (checked_status == true) {
                  $(".ac_quo").show();
                } else {
                  $(".ac_quo").hide();
                }
              });
            });
          </script>
        </head>
        <body>
          <input type="checkbox" id="toggle_ac_cal" checked="checked" />CAL<br/>
          <input type="checkbox" id="toggle_ac_quo" checked="checked" />QUO<br/>
          <table>
            <tbody>
              <tr class="ac_cal">
                <td>CAL</td>
                <td>10 Oct</td>
                <td>John Barnes</td>
              </tr>
              <tr class="ac_cal">
                <td>CAL</td>
                <td>10 Oct</td>
                <td>Neil Burton</td>
              </tr>
              <tr class="ac_quo">
                <td>QUO</td>
                <td>11 Oct</td>
                <td>Neil Armstrong</td>
              </tr>
            </tbody>
          </table>
        </body>
        </html>
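    One way to avoid the duplication (a sketch, not from the original thread) is to lean on the naming convention already in the markup: each checkbox id is "toggle_" plus the class of the rows it controls, so a single handler can cover every category, including any added later.

        $(document).ready(function () {
            // Convention assumed from the question's markup: checkbox "toggle_ac_cal"
            // controls rows with class "ac_cal", and so on for each category.
            $("input[id^='toggle_']").change(function () {
                var rowClass = this.id.replace("toggle_", "");   // e.g. "ac_cal"
                $("tr." + rowClass).toggle(this.checked);        // show when checked, hide otherwise
            });
        });

    Adding a new category then only needs a new checkbox and row class, with no extra JavaScript.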

    Read the article

  • Perl newbie: get a proper minimal debug_mode solution

    - by Michael Mao
    Hi all: I am learning Perl in a "head-first" manner. I am an absolute newbie in this language. I am trying to have a debug_mode switch on the CLI which can be used to control how my script works, by switching certain subroutines "on and off". Below is what I've got so far:

        #!/usr/bin/perl -s -w
        # purpose : make subroutine execution optional,
        # which is depending on a CLI switch flag
        use strict;
        use warnings;

        use constant DEBUG_VERBOSE => "v";
        use constant DEBUG_SUPPRESS_ERROR_MSGS => "s";
        use constant DEBUG_IGNORE_VALIDATION => "i";
        use constant DEBUG_SETPPING_COMPUTATION => "c";

        our ($debug_mode);

        mainMethod();

        sub mainMethod # ()
        {
            if(!$debug_mode)
            {
                print "debug_mode is OFF\n";
            }
            elsif($debug_mode)
            {
                print "debug_mode is ON\n";
            }
            else
            {
                print "OMG!\n";
                exit -1;
            }
            checkArgv();
            printErrorMsg("Error_Code_123", "Parsing Error at...");
            verbose();
        }

        sub checkArgv #()
        {
            print ("Number of ARGV : ".(1 + $#ARGV)."\n");
        }

        sub printErrorMsg # ($error_code, $error_msg, ..)
        {
            if(defined($debug_mode) && !($debug_mode =~ DEBUG_SUPPRESS_ERROR_MSGS))
            {
                print "You can only see me if -debug_mode is NOT set".
                      " to DEBUG_SUPPRESS_ERROR_MSGS\n";
                die("terminated prematurely...\n") and exit -1;
            }
        }

        sub verbose # ()
        {
            if(defined($debug_mode) && ($debug_mode =~ DEBUG_VERBOSE))
            {
                print "Blah blah blah...\n";
            }
        }

    So far as I can tell, it at least works: the -debug_mode switch doesn't interfere with the normal ARGV, and the following command lines work:

        ./optional.pl
        ./optional.pl -debug_mode
        ./optional.pl -debug_mode=v
        ./optional.pl -debug_mode=s

    However, I am puzzled when multiple debug modes are "mixed", such as:

        ./optional.pl -debug_mode=sv
        ./optional.pl -debug_mode=vs

    I don't understand why the above lines "magically work". I see that both DEBUG_VERBOSE and DEBUG_SUPPRESS_ERROR_MSGS apply to the script, which is fine in this case. However, if there are some "conflicting" debug modes, I am not sure how to set the precedence of debug modes. Also, I am not certain whether my approach is good enough for seasoned Perl folks, and I hope I am getting my feet pointed in the right direction. One big problem is that I now put if statements inside most of my subroutines to control their behaviour under different modes. Is this okay? Is there a more elegant way? I know there must be a debug module on CPAN or elsewhere, but I want a truly minimal solution that doesn't depend on any module other than the "default" ones, and I cannot have any control over the environment where this script will be executed... Many thanks for any suggestions in advance.
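    For what it's worth, one minimal pattern (a sketch, not from the original question) stays within the "no non-default modules" constraint by using the core Getopt::Long module and splitting the mode string into a flag hash, so each subroutine asks a single helper instead of repeating regex checks, and precedence between conflicting modes is decided in one place:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Getopt::Long;          # ships with Perl itself

        my $debug_mode = '';
        GetOptions('debug_mode:s' => \$debug_mode);      # e.g. ./optional.pl -debug_mode=vs

        my %debug = map { $_ => 1 } split //, $debug_mode;
        $debug{s} = 0 if $debug{v};                      # example precedence rule: verbose wins over suppress

        sub debug_on { return $debug{ $_[0] } ? 1 : 0 }

        print "Blah blah blah...\n"               if debug_on('v');
        print "error messages would be hidden\n"  if debug_on('s');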

    Read the article

  • Reducing Integer Fractions Algorithm - Solution Explanation?

    - by Andrew Tomazos - Fathomling
    This is a follow-up to this problem: Reducing Integer Fractions Algorithm. The following is a solution to the problem from a grandmaster:

        #include <cstdio>
        #include <vector>
        #include <iterator>
        #include <algorithm>
        #include <functional>
        using namespace std;

        const int MAXN = 100100;
        const int MAXP = 10001000;

        int p[MAXP];

        void init() {
            for (int i = 2; i < MAXP; ++i) {
                if (p[i] == 0) {
                    for (int j = i; j < MAXP; j += i) {
                        p[j] = i;
                    }
                }
            }
        }

        void f(int n, vector<int>& a, vector<int>& x) {
            a.resize(n);
            vector<int>(MAXP, 0).swap(x);
            for (int i = 0; i < n; ++i) {
                scanf("%d", &a[i]);
                for (int j = a[i]; j > 1; j /= p[j]) {
                    ++x[p[j]];
                }
            }
        }

        void g(const vector<int>& v, vector<int> w) {
            for (int i : v) {
                for (int j = i; j > 1; j /= p[j]) {
                    if (w[p[j]] > 0) {
                        --w[p[j]];
                        i /= p[j];
                    }
                }
                printf("%d ", i);
            }
            puts("");
        }

        int main() {
            int n, m;
            vector<int> a, b, x, y, z;
            init();
            scanf("%d%d", &n, &m);
            f(n, a, x);
            f(m, b, y);
            printf("%d %d\n", n, m);
            transform(x.begin(), x.end(), y.begin(),
                      insert_iterator<vector<int> >(z, z.end()),
                      [](int a, int b) { return min(a, b); });
            g(a, z);
            g(b, z);
            return 0;
        }

    It isn't clear to me how it works. Can anyone explain it? The equivalence is as follows: a is the numerator vector of length n, and b is the denominator vector of length m.
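    Not an explanation from the original thread, but the core trick can be isolated: init() builds a table where p[j] is a prime factor of j, so any value can be factorised by repeated division; x[] and y[] then accumulate the prime exponents of the whole numerator and denominator, their element-wise min (z) is the factorisation of gcd(product(a), product(b)), and g() greedily divides that common factor back out of the individual numbers. A toy version of the factor-counting step, with a std::map standing in for the big x[] array:

        #include <cstdio>
        #include <map>

        int main() {
            const int N = 1000;
            static int p[N];                       // p[j] = a prime factor of j (as in init())
            for (int i = 2; i < N; ++i)
                if (p[i] == 0)                     // no smaller prime divided i, so i is prime
                    for (int j = i; j < N; j += i)
                        p[j] = i;

            std::map<int, int> exponents;          // plays the role of x[] for a single value
            for (int j = 360; j > 1; j /= p[j])    // 360 = 2^3 * 3^2 * 5
                ++exponents[p[j]];

            for (std::map<int, int>::iterator it = exponents.begin(); it != exponents.end(); ++it)
                std::printf("%d^%d ", it->first, it->second);
            std::puts("");                         // prints: 2^3 3^2 5^1
            return 0;
        }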

    Read the article

  • Java IOException error=24: too many files open

    - by MattS
    I'm writing a genetic algorithm that needs to read/write lots of files. The fitness test for the GA is invoking a program called gradif, which takes a file as input and produces a file as output. Everything is working except when I make the population size and/or the total number of generations of the genetic algorithm too large. Then, after so many generations, I start getting this:

        java.io.FileNotFoundException: testfiles/GradifOut29 (Too many open files)

    (I get it repeatedly for many different files; the index 29 was just the one that came up first last time I ran it.) It's strange because I'm not getting the error after the first or second generation, but after a significant number of generations, which would suggest that each generation opens more files that it doesn't close. But as far as I can tell I'm closing all of the files. The way the code is set up, the main() function is in the Population class, and the Population class contains an array of Individuals. Here's my code.

    Initial creation of the input files (they're random access so that I could reuse the same file across multiple generations):

        files = new RandomAccessFile[popSize];
        for(int i=0; i<popSize; i++){
            files[i] = new RandomAccessFile("testfiles/GradifIn"+i, "rw");
        }

    At the end of the entire program:

        for(int i=0; i<individuals.length; i++){
            files[i].close();
        }

    Inside the Individual's fitness test:

        FileInputStream fin = new FileInputStream("testfiles/GradifIn"+index);
        FileOutputStream fout = new FileOutputStream("testfiles/GradifOut"+index);
        Process process = Runtime.getRuntime().exec("./gradif");
        OutputStream stdin = process.getOutputStream();
        InputStream stdout = process.getInputStream();

    Then, later....

        try{
            fin.close();
            fout.close();
            stdin.close();
            stdout.close();
            process.getErrorStream().close();
        }catch (IOException ioe){
            ioe.printStackTrace();
        }

    Then, afterwards, I append an 'END' to the files to make parsing them easier.

        FileWriter writer = new FileWriter("testfiles/GradifOut"+index, true);
        writer.write("END");
        try{
            writer.close();
        }catch(IOException ioe){
            ioe.printStackTrace();
        }

    My redirection of stdin and stdout for gradif is from this answer. I tried using the try{close()}catch{} syntax to see if there was a problem with closing any of the files (there wasn't), and I got that from this answer. It should also be noted that the Individuals' fitness tests run concurrently.

    UPDATE: I've actually been able to narrow it down to the exec() call. In my most recent run, I first ran into trouble at generation 733 (with a population size of 100). Why are the earlier generations fine? I don't understand why, if there's no leaking, the algorithm should be able to pass earlier generations but fail on later generations. And if there is leaking, then where is it coming from?

    UPDATE 2: In trying to figure out what's going on here, I would like to be able to see (preferably in real time) how many files the JVM has open at any given point. Is there an easy way to do that?
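    For UPDATE 2, one lightweight option (a sketch; it assumes a Sun/Oracle JVM on a Unix-like OS, where the com.sun.management bean is available) is to ask the JVM itself for its open file-descriptor count, for example once per generation, and watch whether the number keeps climbing:

        import java.lang.management.ManagementFactory;
        import java.lang.management.OperatingSystemMXBean;

        import com.sun.management.UnixOperatingSystemMXBean;

        public class FdWatch {
            // Returns the JVM's current number of open file descriptors, or -1 if the
            // platform bean does not expose it (e.g. on Windows or a non-Sun JVM).
            public static long openFileDescriptors() {
                OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
                if (os instanceof UnixOperatingSystemMXBean) {
                    return ((UnixOperatingSystemMXBean) os).getOpenFileDescriptorCount();
                }
                return -1;
            }

            public static void main(String[] args) {
                System.out.println("Open file descriptors: " + openFileDescriptors());
            }
        }

    From outside the JVM, `lsof -p <pid>` (or listing /proc/<pid>/fd on Linux) shows the same information in real time; if the count grows with every generation, whatever is created per generation (for example the spawned gradif processes and their pipes) is the likely place to look.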

    Read the article

  • SQL SERVER – Read Only Files and SQL Server Management Studio (SSMS)

    - by pinaldave
    Just like any other developer or DBA, SQL Server Management Studio is my favorite application. At any moment I have multiple instances of it open and I am working in them. Recently, I came across a very interesting feature in SSMS related to "Read Only" files. I believe it is a little-known feature as well, so I decided to write a blog post about it.

    First, create a read-only SQL file. You can make any file read only via Right Click >> Properties >> select the Read Only attribute. Now open the same file in SQL Server Management Studio. You will find that beside the file name there is a small 'lock' icon. This small icon indicates that the file is read only.

    Now let us attempt to edit the read-only file. It will let us edit the file any way we want; however, when we attempt to save it, it gives the following pop-up. The options in the pop-up are self-explanatory and I liked them. The goal of a read-only file is to prevent users from making unintended changes, while still leaving the user in complete control of the file. The user should be aware that the file is read only, but if he wants to edit the file or save it as a new file, those choices should be put in front of him, and the pop-up captures precisely that.

    Now let us check the option related to this feature in SSMS. Go to Menu >> Options >> Environment >> Documents. You will find the third option, which is "Allow editing of read-only files; warn when attempt to save". In the scenario above it was already checked. Let us uncheck it and repeat the exercise we did earlier. I closed all the earlier windows to avoid confusion. With the option now unchecked, when I attempt to even modify the read-only file, it gives me a totally different pop-up screen. It gives me options like "Edit In-Memory", "Make Writeable", etc. When you select "Edit In-Memory" it allows you to edit the file, and later you can save it as a new file – just like the earlier scenario we discussed. If you click "Make Writeable", it removes the Read Only restriction and the file can be edited as you please.

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology
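    As a side note (not from the original post), the read-only attribute can also be toggled from a command prompt instead of the Properties dialog if you want to reproduce the demo quickly; the file path below is just an example:

        rem Mark a script read-only before opening it in SSMS
        attrib +R "C:\Scripts\MyQuery.sql"

        rem Clear the attribute again afterwards
        attrib -R "C:\Scripts\MyQuery.sql"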

    Read the article

  • Identify Codecs & Technical Information About Video Files

    - by DigitalGeekery
    Have you ever wanted to play an audio or video file but didn't have the proper codec installed? Today we'll show how to determine codecs, along with a host of other technical details about your media files, with MediaInfo.

    Installation

    Download and install MediaInfo. You can find the download link at the bottom of the page. Note: When installing MediaInfo there is a recommended software bundle which you can opt out of by selecting the Do not install option. The recommended software may differ; in this example it offers Spyware Terminator. The cool thing, though, is that they use OpenCandy, which lets you opt out of the install. Just double-check to make sure you're not installing extra crapware.

    Using MediaInfo

    The first time you run MediaInfo it will display the Preferences window. There are various options such as language, output format, and whether or not you want MediaInfo to check for new versions. Click OK.

    Select a file or folder to analyze by clicking on the File or Folder icons on the left of the application window or by selecting File > Open from the menu. You can also drag and drop a file directly onto the application. MediaInfo will display details of your media file.

    In Basic view, you'll see basic information. Notice in the example below the video and audio codecs, along with the file size, the running time of the media file, and even the application used to create the video file (Writing application). You can switch to some of the other views by selecting View from the menu and choosing from the dropdown list.

    Sheet view presents the information a bit more clearly. You can see in the example below that the video and audio codecs are listed in clearly identified columns. (AVC is more commonly referred to as H.264.)

    Tree view is perhaps the most detailed. You can see from the example below that the codec used for this AVI file is XviD. Scrolling down even further you'll see additional information like video and audio bit rates, frame rate, aspect ratio, and more.

    In Basic view (and also in Sheet view) you can click to find a player for your file. In this instance, with an MP4 file, it took me to the download page for QuickTime. This is by no means the only media player for this file, but if you are stuck for how to play a media file, this will forward you to a solution that works. You can do the same thing with the video codec. Click "Go to the web site of this video codec" to find a download.

    MediaInfo is a simple but powerful tool that can be used to discover the details of a media file, or just to find a compatible codec. It works with almost any video file type and is available for Windows, Mac, and Linux. Some Mac and Linux versions, however, are currently command line only.
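    As a side note not covered in the article itself, the command-line builds mentioned in the last paragraph can be used roughly like this; the file name is just an example, and the exact template field names vary by version:

        # Full report, the same information as the GUI's views:
        mediainfo MyVideo.avi

        # Narrow the output with a template ("Section;fields");
        # `mediainfo --Info-Parameters` lists the field names your version supports:
        mediainfo --Inform="Video;%Format% %Width%x%Height%" MyVideo.avi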
    Download MediaInfo

    Read the article

  • Customisation / overriding of the Envelope ECS files

    - by Dheeraj Kumar M
    There are a few use cases where the requirement is to customise the envelope information (the Interchange/Group ECS file). Such customisations may be needed for only a few customers. Hence, in addition to the default seeded envelope definitions, it is also necessary to upload the customised definitions. Here are the steps for achieving that:

    1. Create only the Interchange ECS and save
    2. Create only the Group ECS and save
    3. Use the same in B2B

    1. Create only the Interchange ECS and save:
       Open the document editor and select the required version and doctype. When creating the new ECS, ensure the checkbox for "insert envelope" is selected. Once created, delete the Group and TransactionSet nodes and retain only the Interchange nodes, including both header and trailer. Save this file.

    2. Create only the Group ECS and save:
       After creating the ECS file as described in the Interchange steps, delete the Interchange and TransactionSet nodes and retain only the Group nodes, including both header and trailer. Save this file.

    3. Use the same in B2B:
       These newly created ECS files can be used in B2B in two ways.

       a. By overriding at the trading partner level:
          This is very useful when the configuration is already complete and the customisation then needs to be incorporated. In this case, just go to Trading Partner - Document and select the document which needs to be customised. Upload the newly created Interchange and Group ECS files under the Interchange and Group tabs respectively, and re-deploy the associated agreement. The advantages of this approach are:
          - the flexibility to add customised envelope definitions per partner
          - saving the re-work of the design-time effort

       b. By adding another document definition in the Administration - Document screen:
          This approach can be used if no configuration has been done at the trading partner level. Create the required document revision and override the Interchange and Group ECS files under the Interchange and Group tabs respectively. Add the document under Trading Partner - Document. Create and deploy the agreements.

    Read the article

  • How to write a custom solution using a Python package, modules etc

    - by morpheous
    I am writing a package foobar which consists of the modules alice, bob, charles and david. From my understanding of Python packages and modules, this means I will create a folder foobar, with the following subdirectories and files (please correct me if I am wrong):

        foobar/
            __init__.py
            alice/alice.py
            bob/bob.py
            charles/charles.py
            david/david.py

    The package should be executable, so that in addition to making the modules alice, bob etc. available as 'libraries', I should also be able to use foobar in a script like this:

        python foobar --args=someargs

    Question 1: Can a package be made executable and used in a script like I described above?

    Question 2: The various modules will use code that I want to refactor into a common library. Does that mean creating a new subdirectory 'foobar/common' and placing common.py in that folder?

    Question 3: How will the modules import the common module? Is it 'from foobar import common', or can I not use this since these modules are part of the package?

    Question 4: I want to add logic for when the foobar package is being used in a script (assuming this can be done - I have only seen it done for modules). The code used is something like:

        if __name__ == "__main__":
            dosomething()

    Where (in which file) would I put this logic?
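    A minimal sketch of one conventional answer to Questions 1-4 (the module names come from the question; the helper names and layout details are assumptions, not the only valid arrangement): keep the modules as plain .py files inside the package, put the shared code in foobar/common.py, and put the script-style entry point in foobar/__main__.py so the package can be run with `python -m foobar`.

        foobar/
            __init__.py
            __main__.py      # entry point: python -m foobar --args=someargs
            common.py        # shared code refactored out of the other modules
            alice.py
            bob.py
            charles.py
            david.py

        # foobar/__main__.py -- sketch only; setup() and run() are hypothetical names.
        import sys

        from foobar import common            # works the same way from alice.py, bob.py, ...
        from foobar import alice


        def main(argv=None):
            argv = sys.argv[1:] if argv is None else argv
            common.setup()                   # hypothetical shared helper
            alice.run(argv)                  # hypothetical per-module entry point
            return 0


        if __name__ == "__main__":           # Question 4: this is where that logic lives
            sys.exit(main())

    Inside the package, `from foobar import common` (or a relative `from . import common`) answers Question 3; a separate foobar/common/ subpackage is only needed if the shared code grows into several files.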

    Read the article

  • Lines-of-code counting for many C# solutions

    - by Eric
    I am currently researching a solution for counting lines of code in C#. I pretty much need a combination of the following two tools:

        http://richnewman.wordpress.com/2007/07/01/c-and-vbnet-line-count-utility/
        http://www.locmetrics.com/index.html

    My problem is that I need to recursively scan a folder containing a lot of Visual Studio solutions, so I can't really use the first tool without major work on its code, as it can only scan a single solution at a time. But I also need the results split per solution, preferably even per contained project, which disqualifies the second tool. I also found NDepend, which suffers from the same problem. Do you know of any free tools that do what I need? I am unable to find anything suitable.
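    As a stopgap while looking for a proper tool, a rough per-solution count can be scripted. The PowerShell sketch below (C:\Source is a placeholder root) simply counts physical lines of every .cs file beneath each .sln's folder, so it ignores project membership, counts blank and comment lines, and may double-count files shared between solutions:

        Get-ChildItem -Path C:\Source -Recurse -Filter *.sln | ForEach-Object {
            # Count every line of every C# source file below this solution's folder
            $lines = (Get-ChildItem -Path $_.DirectoryName -Recurse -Filter *.cs |
                      Get-Content | Measure-Object -Line).Lines
            "{0}`t{1}" -f $_.FullName, $lines
        }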

    Read the article

  • I'm looking for this graph solution

    - by Ben Fransen
    Hello all, recently, while browsing the admin skins at ThemeForest, I found a really nice, smooth, clean graph in one of the templates. I've been searching around, but so far without luck, to find out which solution is used. An example can be found at: http://enstyled.com/adminus/original/page.html

    The source of this graph looks like:

        <table class="stats bar" cellpadding="0" cellspacing="0" width="100%">
          <thead>
            <tr>
              <td>&nbsp;</td>
              <th scope="col">02.09</th>
              <th scope="col">03.09</th>
              <th scope="col">04.09</th>
              <th scope="col">05.09</th>
              <th scope="col">06.09</th>
              <th scope="col">07.09</th>
              <th scope="col">08.09</th>
              <th scope="col">09.09</th>
              <th scope="col">10.09</th>
              <th scope="col">11.09</th>
              <th scope="col">12.09</th>
              <th scope="col">01.10</th>
              <th scope="col">02.10</th>
              <th scope="col">03.10</th>
            </tr>
          </thead>
          <tbody>
            <tr>
              <th scope="row">Page views</th>
              <td>1800</td> <td>900</td> <td>700</td> <td>1200</td> <td>600</td>
              <td>2800</td> <td>3200</td> <td>500</td> <td>2200</td> <td>1000</td>
              <td>1200</td> <td>700</td> <td>650</td> <td>800</td>
            </tr>
            <tr>
              <th scope="row">Unique visitors</th>
              <td>1600</td> <td>650</td> <td>550</td> <td>900</td> <td>500</td>
              <td>2300</td> <td>2700</td> <td>350</td> <td>1700</td> <td>600</td>
              <td>1000</td> <td>500</td> <td>400</td> <td>700</td>
            </tr>
          </tbody>
        </table>

    Can someone please tell me which graphing solution is used here? Thanks in advance, Ben Fransen
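    The markup alone doesn't say which script that particular template ships with, but the pattern is recognisable: the data lives in a plain HTML table and a JavaScript charting plugin draws the graph from it (progressive enhancement). A sketch of that technique using jQuery with the Flot plugin as a stand-in (any canvas charting library would do; the #chart placeholder div is an assumption):

        $(document).ready(function () {
            // Turn each <tbody> row into a series: [[0, 1800], [1, 900], ...]
            var series = [];
            $("table.stats tbody tr").each(function () {
                var points = [];
                $(this).find("td").each(function (i) {
                    points.push([i, parseInt($(this).text(), 10)]);
                });
                series.push({ label: $(this).find("th").text(), data: points });
            });

            $("table.stats").hide();   // keep the table for non-JS users, hide it visually
            // Requires a sized placeholder, e.g. <div id="chart" style="width:600px;height:250px"></div>
            $.plot($("#chart"), series, { lines: { show: true }, points: { show: true } });
        });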

    Read the article

  • JBoss 5.1.0.GA and huge vfs-nested.tmp

    - by Petteri Hietavirta
    I noticed this while running a performance test with JMeter. For the first half hour everything was fine and the /server/all/tmp directory size was around 36M. Then suddenly the tmp directory grew to 6.1G. The space was taken by jar files inside vfs-nested.tmp. I found https://jira.jboss.org/jira/browse/JBAS-7126 and added the configuration suggested there, but it made no difference.

    Read the article

  • E-mail spam analyzing tools

    - by goran
    I have some mail logs which I assume come from our hosted mail server's spam filter:

        antivirus: 1, antispam: 1, sanesecurity: 1, chkuser: 1, chkrbl: 1, chkmx: 1,
        chkptr: 0, greylistlevel: 0, rejectemptyfrom: 1, spamscore: 7.00,
        redirectspam: 1, maxrcpt: 30, maxdatabytes: 50000000, nightguard: 0,
        whitelistsigned: 1

    (plus info on each message's score) as plain text files. I was wondering if anyone knows which tool produces such logs, and whether there are any tools that would parse and analyze them?
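    If no ready-made analyzer turns up, the settings line above is simple enough to parse directly. A throwaway Python sketch (it assumes the comma-separated "key: value" layout shown, which the per-message score lines may not follow, and 'mail.log' is just an example file name):

        def parse_settings(line):
            """Turn 'antivirus: 1, antispam: 1, spamscore: 7.00' into a dict."""
            settings = {}
            for pair in line.split(','):
                key, _, value = pair.partition(':')
                if key.strip():
                    settings[key.strip()] = value.strip()
            return settings

        with open('mail.log') as log:
            for line in log:
                print(parse_settings(line))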

    Read the article

  • "error module dav_svn does not exist"

    - by chris12892
    I accidentally deleted the SVN module (dav_svn) for Apache on my web server, and now I am stuck. Whenever I try to do anything with it (remove it or reinstall it with apt-get), I get that message and apt fails. I know I could reinstall Apache, but I am trying to avoid that at all costs (unless I can do it in such a way that keeps my config files). Any ideas on how to deal with this?
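    One thing worth trying before reinstalling Apache (a guess based on the usual Debian/Ubuntu layout, where this error comes from a2enmod/a2dismod finding a mods-enabled symlink whose mods-available target has been deleted):

        # Remove the dangling symlinks that the module scripts keep tripping over
        sudo rm -f /etc/apache2/mods-enabled/dav_svn.load /etc/apache2/mods-enabled/dav_svn.conf

        # Put the module files back; --reinstall only restores the package's own files,
        # so existing configuration under /etc/apache2 is left alone
        sudo apt-get install --reinstall libapache2-svn

        # Re-enable the module and restart Apache
        sudo a2enmod dav_svn
        sudo /etc/init.d/apache2 restart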

    Read the article

  • how to install software in a different location?

    - by studiohack
    When I am installing software, I usually like to choose where it installs to, other than C:\Program Files. However, from time to time I come across software that does not let you choose the install location. How can I get around this and choose the install location? This is handy in situations such as multiple partitions or separate system and data partitions, etc. Any ideas? Thanks! (Using Windows 7.)

    Read the article

  • What is an 8086 relocatable?

    - by gerrit
    I'm running some Fortran software (LBLRTM), and a shell script that prepares the input generates a number of files with names TAPE3, TAPE4, etc. For debugging purposes, I used file to identify the file type. file tells me:

        TAPE3: 8086 relocatable (Microsoft)

    My guess is that file is wrong here, and that it's just a binary file that happens to look like an 8086 relocatable. But what is an 8086 relocatable?

    Read the article

  • Changing default install path in Windows 7

    - by brack
    I'm trying to change the default install path in Windows 7 Pro x64. I'm in the Registry Editor and in the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion key. I know I need to change ProgramFilesDir and ProgramFilesDir (x86). What I'm not sure about is the CommonFilesDir, CommonFilesDir (x86) and the ProgramW6432Dir. Do I need to or should I change these values to reflect the new program files directory I'm using? I tried Googling ProgramW6432Dir and didn't get much help.
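    For reference, the current values can be inspected from a command prompt before (or instead of) editing them; this is just an inspection sketch, and note that changing ProgramFilesDir system-wide is widely reported to break some installers, so per-installer custom install paths are usually the safer route:

        reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion" /v ProgramFilesDir
        reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion" /v "ProgramFilesDir (x86)"
        reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion" /v ProgramW6432Dir
        reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion" /v CommonFilesDir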

    Read the article

  • Chrome: Save as dialogue creates temp file in download directory, I want to change this location

    - by Gabardine
    I've set the download directory for Chrome to be my desktop, but that means that whenever I use "Save As", it creates a temp file on my desktop until I select the final download destination and close the dialogue. This is deeply frustrating since I browse windowed and I keep seeing the damn things pop up and disappear in the corner of my eye. Is there any way to change the directory in which these temp files are created?

    Read the article

  • SWFUpload multiple files server-side handling

    - by Chau
    I need the user to be able to upload multiple files to my server, thus I am using the SWFUpload utility. SWFUpload sends the files one by one, and I need to store them all in the same temporary directory. My ASP.NET handler receives the files one by one and I can store each file appropriately. My problem is: how do I know which files belong to the same upload? Rephrased: how do I connect the files in my handler?
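    One common way to group them (a sketch, not the only approach): generate a batch id when the upload page is rendered, send it with every file through SWFUpload's post_params setting, and let the handler drop each file into a folder named after that id. "batchId" and the folder layout below are illustrative names, not something SWFUpload defines.

        // ASP.NET handler sketch (an .ashx). It assumes the page passed a GUID to
        // SWFUpload via post_params, e.g.  post_params: { "batchId": "<%= batchGuid %>" },
        // so every file in one upload session carries the same id.
        public class UploadHandler : System.Web.IHttpHandler
        {
            public void ProcessRequest(System.Web.HttpContext context)
            {
                string batchId = context.Request.Form["batchId"];      // same value for each file of the batch
                var file = context.Request.Files["Filedata"];          // SWFUpload's default file field name

                // validate/sanitise batchId before using it in a path in real code
                string dir = context.Server.MapPath("~/App_Data/uploads/" + batchId);
                System.IO.Directory.CreateDirectory(dir);              // no-op if it already exists
                file.SaveAs(System.IO.Path.Combine(dir, System.IO.Path.GetFileName(file.FileName)));

                context.Response.Write("ok");
            }

            public bool IsReusable { get { return false; } }
        }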

    Read the article

  • Mercurial hook to disallow committing large binary files

    - by hekevintran
    I want a Mercurial hook that runs before a transaction is committed and aborts the transaction if a binary file being committed is greater than 1 megabyte. I found the following code, which works fine except for one problem: if my changeset involves removing a file, this hook will throw an exception. The hook (I'm using pretxncommit = python:checksize.newbinsize):

        from mercurial import context, util
        from mercurial.i18n import _
        import mercurial.node as dpynode

        '''hooks to forbid adding binary file over a given size

        Ensure the PYTHONPATH is pointing where hg_checksize.py is and setup your
        repo .hg/hgrc like this:

        [hooks]
        pretxncommit = python:checksize.newbinsize
        pretxnchangegroup = python:checksize.newbinsize
        preoutgoing = python:checksize.nopull

        [limits]
        maxnewbinsize = 10240
        '''

        def newbinsize(ui, repo, node=None, **kwargs):
            '''forbid to add binary files over a given size'''
            forbid = False
            # default limit is 10 MB
            limit = int(ui.config('limits', 'maxnewbinsize', 10000000))
            tip = context.changectx(repo, 'tip').rev()
            ctx = context.changectx(repo, node)
            for rev in range(ctx.rev(), tip+1):
                ctx = context.changectx(repo, rev)
                print ctx.files()
                for f in ctx.files():
                    fctx = ctx.filectx(f)
                    filecontent = fctx.data()
                    # check only for new files
                    if not fctx.parents():
                        if len(filecontent) > limit and util.binary(filecontent):
                            msg = 'new binary file %s of %s is too large: %ld > %ld\n'
                            hname = dpynode.short(ctx.node())
                            ui.write(_(msg) % (f, hname, len(filecontent), limit))
                            forbid = True
            return forbid

    The exception:

        $ hg commit -m 'commit message'
        error: pretxncommit hook raised an exception: apps/helpers/templatetags/include_extends.py@bced6272d8f4: not found in manifest
        transaction abort!
        rollback completed
        abort: apps/helpers/templatetags/include_extends.py@bced6272d8f4: not found in manifest!

    I'm not familiar with writing Mercurial hooks, so I'm pretty confused about what's going on. Why does the hook care that a file was removed if hg already knows about it? Is there a way to fix this hook so that it works all the time?

    Update (solved): I modified the hook to filter out files that were removed in the changeset.

        def newbinsize(ui, repo, node=None, **kwargs):
            '''forbid to add binary files over a given size'''
            forbid = False
            # default limit is 10 MB
            limit = int(ui.config('limits', 'maxnewbinsize', 10000000))
            ctx = repo[node]
            for rev in xrange(ctx.rev(), len(repo)):
                ctx = context.changectx(repo, rev)

                # do not check the size of files that have been removed;
                # files that have been removed do not have filecontexts,
                # so to test whether a file was removed, test for the existence of a filecontext
                filecontexts = list(ctx)

                def file_was_removed(f):
                    """Returns True if the file was removed"""
                    if f not in filecontexts:
                        return True
                    else:
                        return False

                for f in itertools.ifilterfalse(file_was_removed, ctx.files()):
                    fctx = ctx.filectx(f)
                    filecontent = fctx.data()
                    # check only for new files
                    if not fctx.parents():
                        if len(filecontent) > limit and util.binary(filecontent):
                            msg = 'new binary file %s of %s is too large: %ld > %ld\n'
                            hname = dpynode.short(ctx.node())
                            ui.write(_(msg) % (f, hname, len(filecontent), limit))
                            forbid = True
            return forbid

    Read the article
