Search Results

Search found 3604 results on 145 pages for 'bigced 21'.

Page 75 of 145.

  • error: gnome.h: No such file or directory

    - by michael
    I would like to resolve this 'error: gnome.h: No such file or directory ' on ubuntu. I get this error: /bin/sh: gnome-config: not found In file included from TestMDI.cpp:18: ../../../../dist/include/system_wrappers/gnome.h:3:24: error: gnome.h: No such file or directory From this: http://ubuntuforums.org/showthread.php?t=295105 I tried: $ sudo apt-get install libgnomeui-dev Reading package lists... Done Building dependency tree Reading state information... Done libgnomeui-dev is already the newest version. 0 upgraded, 0 newly installed, 0 to remove and 21 not upgraded. But that still does not resolve my problem. Any idea how to fix it?


  • Why am I getting unexpected output trying to write a hash structure to a file?

    - by Harm De Weirdt
    I have a hash in which I store the products a customer buys (%orders). It uses the product code as key and has a reference to an array with the other info as value. At the end of the program, I have to rewrite the inventory to the updated version (i.e. subtract the quantity of the bought items). This is how I rewrite the inventory: sub rewriteInventory{ open(FILE,'>inv.txt'); foreach $key(%inventory){ print FILE "$key\|$inventory{$key}[0]\|$inventory{$key}[1]\|$inventory{$key}[2]\n" } close(FILE); } where $inventory{$key}[x] is 0 → Title, 1 → price, 2 → quantity. The problem here is that when I look at inv.txt afterwards, I see things like this:
    CD-911|Lady Gaga - The Fame|15.99|21
    ARRAY(0x145030c)|||
    BOOK-1453|The Da Vinci Code - Dan Brown|14.75|12
    ARRAY(0x145bee4)|||
    Where do these ARRAY(0x145030c)||| entries come from? Or more importantly, how do I get rid of them?


  • ec2_bundle_vol fails with error LoadError

    - by Koran
    Hi, I am a newbie at Amazon EC2 setup. I have now set up a machine to my taste - and I now want to bundle it. I am running the following command from the launched instance - root@domU-21-34-67-26-ED-Z4:~# ec2-bundle-vol -r i386 -d /mnt \ -p ACT-VOL -u 8940-1355-4155 -k /tmp/pk-key.pem \ -c /tmp/cert.pem -s 10240 \ -e /mnt,/root/.ssh,/home/ubuntu/.ssh ruby: No such file or directory -- /home/ubuntu/ec2tools/ec2-api-tools-1.3-46266/lib/ec2/amitools/bundlevol.rb (LoadError) The ruby version is 1.8.7. I searched the internet and installed libruby1.8-extras etc. too, but to no avail. I also tried running it from site_ruby (/usr/local/lib/site_ruby) - but no use. I also tried installing Ruby 1.8.6, but was unable to find a way to do so. Any help would be much appreciated. Thanks, K


  • Crash in Reachability API

    - by Marc
    I'm using the Reachability sample code provided by Apple to check the network connectivity and get notified of changes (Reachability Sample code). I had a look at some crash logs of my app. It seems that some crashes are due to the Reachability/SystemConfiguration API (see below). SCNetworkReachabilityGetFlags is only used in the Reachability class provided by Apple. Or am I misinterpreting the crash log?
    Exception Type: EXC_CRASH (SIGABRT)
    Exception Codes: 0x00000000, 0x00000000
    Crashed Thread: 0
    Thread 0 Crashed:
    0 libSystem.B.dylib 0x00000e70 mach_msg_trap + 20
    1 libSystem.B.dylib 0x00003354 mach_msg + 60
    2 SystemConfiguration 0x0001f480 configopen + 168
    3 SystemConfiguration 0x00004d08 SCDynamicStoreCreateWithOptions + 272
    4 SystemConfiguration 0x00004e54 SCDynamicStoreCreate + 24
    5 SystemConfiguration 0x00015244 updateReachabilityStoreInfo + 152
    6 SystemConfiguration 0x00016f04 updateCommCenterStatus + 32
    7 SystemConfiguration 0x00017678 checkAddress + 1368
    8 SystemConfiguration 0x0001a260 __SCNetworkReachabilityGetFlags + 1992
    9 SystemConfiguration 0x0001b00c rlsPerform + 132
    10 CoreFoundation 0x00058266 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 8
    11 CoreFoundation 0x00028692 __CFRunLoopDoSources0 + 214
    12 CoreFoundation 0x00027f62 __CFRunLoopRun + 258
    13 CoreFoundation 0x00027d74 CFRunLoopRunSpecific + 220
    14 CoreFoundation 0x00027c82 CFRunLoopRunInMode + 54
    15 GraphicsServices 0x00004e84 GSEventRunModal + 188
    16 UIKit 0x00004f8c -[UIApplication _run] + 564
    17 UIKit 0x000024cc UIApplicationMain + 964
    18 MyApp 0x0000f80c main (main.m:21)
    19 MyApp 0x0000f77c start + 44


  • How to Remove the Last Week Of a Calendar

    - by Nassign
    I am not sure why other people have not asked this before, but have you noticed that the asp:Calendar shows an extra week at the end? For example, if the VisibleMonth is set to 2010-03-01 and FirstDayOfWeek to Sunday, it will show 6 weeks: Feb 28 to March 6, March 7 to March 13, March 14 to March 20, March 21 to March 27, March 28 to April 3, and April 4 to April 10. I was wondering why Microsoft shows the last row, which falls entirely in April. I tried to search the net for a property, but one does not seem to exist. The only solution I could think of is to override Pre_Render and check each individual date to see if it is still within the weeks of the VisibleDate, but of course that is a lot of checking, since it has to happen on every render of the control.


  • How to walk through two files simultaneously in Perl?

    - by Alex Reynolds
    I have two text files that contain columnar data of the variety position-value. Here is an example of the first file (file A):
    100 1
    101 1
    102 0
    103 2
    104 1
    ...
    Here is an example of the second file (B):
    20 0
    21 0
    ...
    100 2
    101 1
    192 3
    193 1
    ...
    Instead of reading one of the two files into a hash table, which is prohibitive due to memory constraints, what I would like to do is walk through two files simultaneously, in a stepwise fashion. What this means is that I would like to stream through lines of either A or B and compare position values. If the two positions are equal, then I perform a calculation on the values associated with that position. Otherwise, if the positions are not equal, I move through lines of file A or file B until the positions are equal (when I again perform my calculation) or I reach EOF of both files. Is there a way to do this in Perl?
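
    The merge-style walk described above can be sketched as follows. This is an illustrative Python version rather than Perl; the file names a.txt and b.txt and the "calculation" (printing the sum of the two values) are made up for the example, and both files are assumed to be sorted by position, as in the samples shown.

```python
def records(path):
    """Yield (position, value) pairs from a whitespace-separated two-column file."""
    with open(path) as fh:
        for line in fh:
            pos, val = line.split()
            yield int(pos), int(val)

def walk_together(path_a, path_b, calculate):
    """Advance through both sorted files in lockstep, calling calculate on matching positions."""
    a, b = records(path_a), records(path_b)
    rec_a, rec_b = next(a, None), next(b, None)
    while rec_a is not None and rec_b is not None:
        pos_a, val_a = rec_a
        pos_b, val_b = rec_b
        if pos_a == pos_b:        # positions match: do the work, advance both files
            calculate(pos_a, val_a, val_b)
            rec_a, rec_b = next(a, None), next(b, None)
        elif pos_a < pos_b:       # file A is behind: advance A only
            rec_a = next(a, None)
        else:                     # file B is behind: advance B only
            rec_b = next(b, None)

if __name__ == "__main__":
    walk_together("a.txt", "b.txt", lambda pos, va, vb: print(pos, va + vb))
```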


  • NSNumber >= 13 won't retain. Everything else will.

    - by jkap
    The code I'm currently working on requires adding an NSNumber object to an array. All of the NSNumbers with value 0-12 are added fine, but 13 onward causes an EXC_BAD_ACCESS. I turned on NSZombieEnabled and am now getting *** -[CFNumber retain]: message sent to deallocated instance 0x3c78420. Here's the call stack:
    #0 0x01eac3a7 in ___forwarding___
    #1 0x01e886c2 in __forwarding_prep_0___
    #2 0x01e3f988 in CFRetain
    #3 0x01e4b586 in _CFArrayReplaceValues
    #4 0x0002a2f9 in -[NSCFArray insertObject:atIndex:]
    #5 0x0002a274 in -[NSCFArray addObject:]
    #6 0x00010a3b in -[Faves addObject:] at Faves.m:24
    #7 0x000062ff in -[ShowController processFave] at ShowController.m:458
    #8 0x002af405 in -[UIApplication sendAction:to:from:forEvent:]
    #9 0x00312b4e in -[UIControl sendAction:to:forEvent:]
    #10 0x00314d6f in -[UIControl(Internal) _sendActionsForEvents:withEvent:]
    #11 0x00313abb in -[UIControl touchesEnded:withEvent:]
    #12 0x002c8ddf in -[UIWindow _sendTouchesForEvent:]
    #13 0x002b27c8 in -[UIApplication sendEvent:]
    #14 0x002b9061 in _UIApplicationHandleEvent
    #15 0x02566d59 in PurpleEventCallback
    #16 0x01e83b80 in CFRunLoopRunSpecific
    #17 0x01e82c48 in CFRunLoopRunInMode
    #18 0x02565615 in GSEventRunModal
    #19 0x025656da in GSEventRun
    #20 0x002b9faf in UIApplicationMain
    #21 0x00002498 in main at main.m:14
    If it wasn't isolated to NSNumbers of a certain range, I'd assume I screwed something up with my memory management, but I've just got no idea. Any ideas? Thanks, Josh


  • How do I average the difference between specific values in TSQL?

    - by jvenema
    Hey folks, sorry this is a bit of a longer question... I have a table with the following columns:
    [ChatID] [User] [LogID] [CreatedOn] [Text]
    What I need to find is the average response time from a given user ID to another specific user ID. So, if my data looks like:
    [1] [john] [20] [1/1/11 3:00:00] [Hello]
    [1] [john] [21] [1/1/11 3:00:23] [Anyone there?]
    [1] [susan] [22] [1/1/11 3:00:43] [Hello!]
    [1] [susan] [23] [1/1/11 3:00:53] [What's up?]
    [1] [john] [24] [1/1/11 3:01:02] [Not much]
    [1] [susan] [25] [1/1/11 3:01:08] [Cool]
    ...then I need to see that Susan has an average response time of (20 + 6) / 2 = 13 seconds to John, and John has an average of (9 / 1) = 9 seconds to Susan. I'm not even sure this can be done in set-based logic, but if anyone has any ideas, they'd be much appreciated!
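
    To make the pairing rule behind the (20 + 6) / 2 = 13 and 9-second figures explicit, here is a small sketch of just that logic in Python (not the TSQL itself), run over the sample rows above with timestamps expressed as seconds after 3:00:00: a response time is measured from the other user's most recent message to the first message of the reply.

```python
from collections import defaultdict

rows = [  # (user, seconds after 3:00:00), taken from the sample data above
    ("john", 0), ("john", 23), ("susan", 43),
    ("susan", 53), ("john", 62), ("susan", 68),
]

delays = defaultdict(list)
prev_user, prev_time = None, None
for user, t in rows:
    if prev_user is not None and user != prev_user:
        delays[user].append(t - prev_time)   # time since the other user's last message
    prev_user, prev_time = user, t

for user, d in delays.items():
    print(user, sum(d) / len(d))             # susan -> 13.0, john -> 9.0
```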


  • Math: How to sum each row of a matrix

    - by macek
    I have a 1x8 matrix of students where each student is a 4x1 matrix of scores. Something like:
    SCORES
    S [62, 91, 74, 14]
    T [59, 7 , 59, 21]
    U [44, 9 , 69, 6 ]
    D [4 , 32, 28, 53]
    E [78, 99, 53, 83]
    N [48, 86, 89, 60]
    T [56, 71, 15, 80]
    S [47, 67, 79, 40]
    Main question: Using sigma notation, or some other mathematical function, how can I get a 1x8 matrix where each student's scores are summed?
    # expected result
    TOTAL OF SCORES
    S [241]
    T [146]
    U [128]
    D [117]
    E [313]
    N [283]
    T [222]
    S [233]
    Sub question: To get the average, I will multiply the matrix by 1/4. Would there be a quicker way to get the final result?
    AVERAGE SCORE
    S [60.25]
    T [36.50]
    U [32.00]
    D [29.25]
    E [78.25]
    N [70.75]
    T [55.50]
    S [58.25]
    Note: I'm not looking for programming-related algorithms here. I want to know if it is possible to represent this with pure mathematical functions alone.
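
    In plain matrix algebra (a sketch of one standard way to write it, not tied to any particular tool), summing each row is just multiplication by a column vector of ones, and the average is the same product scaled by 1/4:

```latex
% S is the 8x4 matrix of scores; t holds the row totals and a the row averages.
\[
  t_i = \sum_{j=1}^{4} s_{ij},
  \qquad
  \mathbf{t} = S\,\mathbf{1}_4,
  \qquad
  \mathbf{a} = \tfrac{1}{4}\,S\,\mathbf{1}_4,
  \quad\text{where } \mathbf{1}_4 = (1,1,1,1)^{\mathsf{T}}.
\]
```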


  • Plugin or module for filtering/sorting a large amount of data?

    - by prometheus
    I have a rather large amount of data (100 MB or so) that I would like to present to a user. The format of the data is similar to the following...
    Date              Location      Log File          Link
    03/21/2010   San Diego   some_log.txt   http://somelink.com
    etc.
    My problem is that I would like to have some nice/slick way for the user to filter the information. Unfortunately, because there is so much of it, the jQuery Table Filter plugin does not work (crashes the browser). I was wondering if there is a nice solution or if I have to simply do the filtering on the server end and have a bland pull-down menu / select-box interface for the client to use.


  • excanvas throws me errors in IE

    - by oshafran
    Hi, I am using excanvas.js, which is used with the flot jQuery graphs. In FF or Chrome the graphs show up great; on IE I get this error:
    Unknown runtime error excanvas.min.js, line 144 character 21
    el.getContext = getContext;
    // Remove fallback content. There is no way to hide text nodes so we
    // just remove all childNodes. We could hide all elements and remove
    // text nodes but who really cares about the fallback content.
    el.innerHTML = '';
    el in the stack is DispHTMLUnknownElement. What can that be? Thank you!


  • WPF Textblock Convert Issue

    - by deep
    I am using a TextBlock in a UserControl, and I am sending a value to the TextBlock from another form. When I pass a value it shows up in the TextBlock, but I need to convert the number to text, so I used a converter on the TextBlock, but it's not working: <TextBlock Height="21" Name="txtStatus" Width="65" Background="Bisque" TextAlignment="Center" Text="{Binding Path=hM1,Converter={StaticResource TextConvert},Mode=OneWay}"/> Converter class: class TextConvert : IValueConverter { public object Convert(object value, Type targetType, object parameter, CultureInfo culture) { if (value != null) { if (value.ToString() == "1") { return value = "Good"; } if (value.ToString() == "0") { return value = "NIL"; } } return value = ""; } public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture) { return (string)value; } } Is this right? What's wrong with it?


  • phing FtpDeploy "connection to host failed"

    - by Jorre
    I'm getting the following error when trying to deploy a ZIP file to a remote FTP server. I tried connecting to the server using an FTP client (FileZilla) and all goes well. Also, connections to a public FTP server like ftp.belnet.be work fine. I'm trying to send the file to a VSFTPD server behind a router using port forwarding. Again, this works fine from any location using FileZilla; phing is not connecting, though...
    BUILD FAILED
    /deployment/build.xml:60:12: Could not connect to FTP server x.x.x.x on port 21: Connection to host failed
    Total time: 2 minutes 30.09 seconds


  • Add title to meta analysis forest plot

    - by Timothy Alston
    I am meta-analysing some studies and drawing a forest plot for my results. However, I can't seem to get the forest plot to display the title. An example of my code is: require(meta) parameter1<-metaprop(sm="PLOGIT", event=c(4,16,3,2,10,1,0,2), n=c(90,402,89,29,153,86,21,48), level = 0.95, studlab=c("study 1", "study 2", "study 3", "study 4", "study 5", "study 6", "study 7", "study 8"), title="meta analysis 1") forest(parameter1) When it produces the forest plot, the title "meta analysis 1" is missing. How can I add this in? Thanks in advance, Timothy


  • SQL Server view: how to add missing rows using interpolation

    - by Christopher Klein
    Running into a problem. I have a table defined to hold the values of the daily treasury yield curve. It's a pretty simple table used for historical lookup of values. There are, notably, some gaps in the table at years 4, 6, 8, 9, 11-19 and 21-29. The formula is pretty simple: to calculate year 4, it's 0.5*Year3Value + 0.5*Year5Value. The problem is: how can I write a VIEW that returns the missing years? I could probably do it in a stored procedure, but the end result needs to be a view.
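
    The year-4 rule quoted above is the midpoint case of ordinary linear interpolation between the nearest known years a < y < b; as a sketch of the general formula (my generalization, not taken from the question) that the view would presumably apply to the wider gaps as well:

```latex
\[
  v(y) = v(a) + \frac{y - a}{b - a}\,\bigl(v(b) - v(a)\bigr),
  \qquad\text{e.g.}\quad
  v(4) = v(3) + \tfrac{4 - 3}{5 - 3}\,\bigl(v(5) - v(3)\bigr) = 0.5\,v(3) + 0.5\,v(5).
\]
```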


  • Optimize SQL with Interbase

    - by Roland Bengtsson
    I was inspired by the good answers from my previous question about SQL. Now this SQL is run on a DB with Interbase 2009. It is about 21 GB in size.
    SELECT DistanceAsMeters, Bold_Id, Created, AddressFrom.CityName_CO as FromCity, AddressTo.CityName_CO as ToCity
    FROM AddrDistance
    LEFT JOIN Address AddressFrom ON AddrDistance.FromAddress = AddressFrom.Bold_Id
    LEFT JOIN Address AddressTo ON AddrDistance.ToAddress = AddressTo.Bold_Id
    Where DistanceAsMeters = 0 and PseudoDistanceAsCostKm = 0 and not AddrDistance.bold_id in (select bold_id from DistanceQueryTask)
    Order By Created Desc
    There are 840,000 rows in AddrDistance, 190,000 rows in Address, and 4 in DistanceQueryTask. The question is, can this be done faster? I guess the same subquery, select bold_id from DistanceQueryTask, is run many times. Note that I'm not interested in stored procedures, just plain SQL :)


  • Android serialization: ImageView

    - by embo
    I have a simple class: public class Ball2 extends ImageView implements Serializable { public Ball2(Context context) { super(context); } } Serialization is OK: private void saveState() throws IOException { ObjectOutputStream oos = new ObjectOutputStream(openFileOutput("data", MODE_PRIVATE)); try { Ball2 data = new Ball2(Game2.this); oos.writeObject(data); oos.flush(); } catch (Exception e) { Log.e("write error", e.getMessage(), e); } finally { oos.close(); } } But deserialization private void loadState() throws IOException { ObjectInputStream ois = new ObjectInputStream(openFileInput("data")); try { Ball2 data = (Ball2) ois.readObject(); } catch (Exception e) { Log.e("read error", e.getMessage(), e); } finally { ois.close(); } } fails with this error: 03-24 21:52:43.305: ERROR/read error(1948): java.io.InvalidClassException: android.widget.ImageView; IllegalAccessException How do I deserialize the object correctly?


  • Can I get an example please?

    - by Doug
    $starcraft = array(
      "drone" => array(
        "cost" => "6_0-",
        "gas" => "192",
        "minerals" => "33",
        "attack" => "123",
      ),
      "zealot" => array(
        "cost" => "5_0-",
        "gas" => "112",
        "minerals" => "21",
        "attack" => "321",
      )
    );
    I'm playing with OOP and I want to display the information in this array using a class, but I don't know how to construct the class to display it. This is what I have so far, and I don't know where to go from here. Am I supposed to use setters and getters? class gamesInfo($game) { $unitname; $cost; $gas; $minerals; $attack; }
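
    For what it's worth, here is a minimal sketch of one way to wrap each unit's data in a class with a display method. It is illustrative Python rather than PHP, and the class and method names are made up for the example; the data mirrors the $starcraft array above.

```python
class UnitInfo:
    def __init__(self, name, stats):
        self.name = name
        self.stats = stats                      # dict of cost/gas/minerals/attack

    def display(self):
        print(self.name)
        for key, value in self.stats.items():
            print(f"  {key}: {value}")

starcraft = {
    "drone":  {"cost": "6_0-", "gas": "192", "minerals": "33", "attack": "123"},
    "zealot": {"cost": "5_0-", "gas": "112", "minerals": "21", "attack": "321"},
}

for name, stats in starcraft.items():
    UnitInfo(name, stats).display()
```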


  • Enforcing a query in MySql to use a specific index

    - by Hossein
    Hi, I have a large table consisting of only 3 columns (id(INT), bookmarkID(INT), tagID(INT)). I have two BTREE indexes, one each on the bookmarkID and tagID columns. This table has about 21 million records. I am trying to run this query: SELECT bookmarkID,COUNT(bookmarkID) AS count FROM bookmark_tag_map GROUP BY tagID,bookmarkID HAVING tagID IN (-----"tagIDList"-----) AND count >= N which takes ages to return the results. I read somewhere that if I make a composite index on tagID,bookmarkID together, I will get a much faster result. I created the index after some time and tried the query again, but it seems that this query is not using the new index that I have made; I ran EXPLAIN and saw that this is actually true. My question now is: how can I force a query to use a specific index? Comments on other ways to make the query faster are also welcome. Thanks


  • Insert row numbers repeatedly in records in T-SQL

    - by jeff
    Hi, I want to insert a row number into my records that counts rows in repeating groups of a fixed size. Example output:
    RowNumber ID Name
    1 20 a
    2 21 b
    3 22 c
    1 23 d
    2 24 e
    3 25 f
    1 26 g
    2 27 h
    3 28 i
    1 29 j
    2 30 k
    I would rather try using rownumber() over (partition by order by column name), but my real records do not contain columns that would produce the 1-3 row numbers. I already tried looping over each record to insert a row count of 1-3, but the loop hurts the performance of the query. The query will be used for an RDL report, which is why the performance of the query must be as good as possible. Any suggestions are welcome. Thanks
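
    The numbering pattern in the sample output is just cyclic numbering by ordinal position; as a sketch of the arithmetic only (not the T-SQL), if i is a row's ordinal position (1, 2, 3, ...) then:

```latex
\[
  \text{RowNumber}(i) = \bigl((i - 1) \bmod 3\bigr) + 1,
  \qquad\text{e.g.}\quad
  \text{RowNumber}(4) = (3 \bmod 3) + 1 = 1 \quad\text{(the row with ID 23)}.
\]
```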


  • Is there a good extension for working with SVN in Emacs?

    - by allyourcode
    I've tried psvn.el, but the command to diff the file you're currently looking at is just hideous: M-x svn-file-show-svn-diff. I tried installing vc-svn.el, but couldn't get that working on my version of Emacs: GNU Emacs 21.3.1 (i386-mingw-nt5.1.2600) of 2004-03-10 on NYAUMO. I've tried putting a copy of vc-svn.el in my site-lisp dir, but when I try to run the command "M-x vc-diff" it says my file "is not under version control". The Emacs wiki page, which mainly focuses on vc-svn.el, seems to be horribly out of date, as many of the links do not work.


  • What statistics can be maintained for a set of numerical data without iterating?

    - by Dan Tao
    Update Just for future reference, I'm going to list all of the statistics that I'm aware of that can be maintained in a rolling collection, recalculated as an O(1) operation on every addition/removal (this is really how I should've worded the question from the beginning): Obvious Count Sum Mean Max* Min* Median** Less Obvious Variance Standard Deviation Skewness Kurtosis Mode*** Weighted Average Weighted Moving Average**** OK, so to put it more accurately: these are not "all" of the statistics I'm aware of. They're just the ones that I can remember off the top of my head right now. *Can be recalculated in O(1) for additions only, or for additions and removals if the collection is sorted (but in this case, insertion is not O(1)). Removals potentially incur an O(n) recalculation for non-sorted collections. **Recalculated in O(1) for a sorted, indexed collection only. ***Requires a fairly complex data structure to recalculate in O(1). ****This can certainly be achieved in O(1) for additions and removals when the weights are assigned in a linearly descending fashion. In other scenarios, I'm not sure. Original Question Say I maintain a collection of numerical data -- let's say, just a bunch of numbers. For this data, there are loads of calculated values that might be of interest; one example would be the sum. To get the sum of all this data, I could... Option 1: Iterate through the collection, adding all the values: double sum = 0.0; for (int i = 0; i < values.Count; i++) sum += values[i]; Option 2: Maintain the sum, eliminating the need to ever iterate over the collection just to find the sum: void Add(double value) { values.Add(value); sum += value; } void Remove(double value) { values.Remove(value); sum -= value; } EDIT: To put this question in more relatable terms, let's compare the two options above to a (sort of) real-world situation: Suppose I start listing numbers out loud and ask you to keep them in your head. I start by saying, "11, 16, 13, 12." If you've just been remembering the numbers themselves and nothing more, and then I say, "What's the sum?", you'd have to think to yourself, "OK, what's 11 + 16 + 13 + 12?" before responding, "52." If, on the other hand, you had been keeping track of the sum yourself while I was listing the numbers (i.e., when I said, "11" you thought "11", when I said "16", you thought, "27," and so on), you could answer "52" right away. Then if I say, "OK, now forget the number 16," if you've been keeping track of the sum inside your head you can simply take 16 away from 52 and know that the new sum is 36, rather than taking 16 off the list and them summing up 11 + 13 + 12. So my question is, what other calculations, other than the obvious ones like sum and average, are like this? SECOND EDIT: As an arbitrary example of a statistic that (I'm almost certain) does require iteration -- and therefore cannot be maintained as simply as a sum or average -- consider if I asked you, "how many numbers in this collection are divisible by the min?" Let's say the numbers are 5, 15, 19, 20, 21, 25, and 30. The min of this set is 5, which divides into 5, 15, 20, 25, and 30 (but not 19 or 21), so the answer is 5. Now if I remove 5 from the collection and ask the same question, the answer is now 2, since only 15 and 30 are divisible by the new min of 15; but, as far as I can tell, you cannot know this without going through the collection again. 
So I think this gets to the heart of my question: if we can divide kinds of statistics into these categories, those that are maintainable (my own term, maybe there's a more official one somewhere) versus those that require iteration to compute any time a collection is changed, what are all the maintainable ones? What I am asking about is not strictly the same as an online algorithm (though I sincerely thank those of you who introduced me to that concept). An online algorithm can begin its work without having even seen all of the input data; the maintainable statistics I am seeking will certainly have seen all the data, they just don't need to reiterate through it over and over again whenever it changes.
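
    As a concrete illustration of the "maintainable" category, here is a minimal sketch (in Python, building only on the running-sum idea from Option 2 above) that keeps a running count, sum, and sum of squares so that count, sum, mean, and variance are all updated in O(1) on every addition or removal. It is numerically naive; a Welford-style update is more stable, but this shows the principle.

```python
class RollingStats:
    """Maintain count, sum, mean and variance with O(1) add/remove updates."""

    def __init__(self):
        self.n = 0
        self.total = 0.0
        self.total_sq = 0.0

    def add(self, x):
        self.n += 1
        self.total += x
        self.total_sq += x * x

    def remove(self, x):        # x must currently be in the collection
        self.n -= 1
        self.total -= x
        self.total_sq -= x * x

    @property
    def mean(self):
        return self.total / self.n

    @property
    def variance(self):         # population variance
        return self.total_sq / self.n - self.mean ** 2

stats = RollingStats()
for x in (11, 16, 13, 12):
    stats.add(x)
print(stats.total)              # 52.0, without re-summing the collection
stats.remove(16)
print(stats.total, stats.mean)  # 36.0 12.0
```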


  • ProFTPd server on Ubuntu getting access denied message when successfully authenticated?

    - by exxoid
    I have a Ubuntu box with a ProFTPD 1.3.4a Server, when I try to log in via my FTP Client I cannot do anything as it does not allow me to list directories; I have tried logging in as root and as a regular user and tried accessing different paths within the FTP Server. The error I get in my FTP Client is: Status: Retrieving directory listing... Command: CDUP Response: 250 CDUP command successful Command: PWD Response: 257 "/var" is the current directory Command: PASV Response: 227 Entering Passive Mode (172,16,4,22,237,205). Command: MLSD Response: 550 Access is denied. Error: Failed to retrieve directory listing Any idea? Here is the config of my proftpd: # # /etc/proftpd/proftpd.conf -- This is a basic ProFTPD configuration file. # To really apply changes, reload proftpd after modifications, if # it runs in daemon mode. It is not required in inetd/xinetd mode. # # Includes DSO modules Include /etc/proftpd/modules.conf # Set off to disable IPv6 support which is annoying on IPv4 only boxes. UseIPv6 off # If set on you can experience a longer connection delay in many cases. IdentLookups off ServerName "Drupal Intranet" ServerType standalone ServerIdent on "FTP Server ready" DeferWelcome on # Set the user and group that the server runs as User nobody Group nogroup MultilineRFC2228 on DefaultServer on ShowSymlinks on TimeoutNoTransfer 600 TimeoutStalled 600 TimeoutIdle 1200 DisplayLogin welcome.msg DisplayChdir .message true ListOptions "-l" DenyFilter \*.*/ # Use this to jail all users in their homes # DefaultRoot ~ # Users require a valid shell listed in /etc/shells to login. # Use this directive to release that constrain. # RequireValidShell off # Port 21 is the standard FTP port. Port 21 # In some cases you have to specify passive ports range to by-pass # firewall limitations. Ephemeral ports can be used for that, but # feel free to use a more narrow range. # PassivePorts 49152 65534 # If your host was NATted, this option is useful in order to # allow passive tranfers to work. You have to use your public # address and opening the passive ports used on your firewall as well. # MasqueradeAddress 1.2.3.4 # This is useful for masquerading address with dynamic IPs: # refresh any configured MasqueradeAddress directives every 8 hours <IfModule mod_dynmasq.c> # DynMasqRefresh 28800 </IfModule> # To prevent DoS attacks, set the maximum number of child processes # to 30. If you need to allow more than 30 concurrent connections # at once, simply increase this value. Note that this ONLY works # in standalone mode, in inetd mode you should use an inetd server # that allows you to limit maximum number of processes per service # (such as xinetd) MaxInstances 30 # Set the user and group that the server normally runs at. # Umask 022 is a good standard umask to prevent new files and dirs # (second parm) from being group and world writable. Umask 022 022 # Normally, we want files to be overwriteable. AllowOverwrite on # Uncomment this if you are using NIS or LDAP via NSS to retrieve passwords: # PersistentPasswd off # This is required to use both PAM-based authentication and local passwords AuthPAMConfig proftpd AuthOrder mod_auth_pam.c* mod_auth_unix.c # Be warned: use of this directive impacts CPU average load! # Uncomment this if you like to see progress and transfer rate with ftpwho # in downloads. That is not needed for uploads rates. 
# UseSendFile off TransferLog /var/log/proftpd/xferlog SystemLog /var/log/proftpd/proftpd.log # Logging onto /var/log/lastlog is enabled but set to off by default #UseLastlog on # In order to keep log file dates consistent after chroot, use timezone info # from /etc/localtime. If this is not set, and proftpd is configured to # chroot (e.g. DefaultRoot or <Anonymous>), it will use the non-daylight # savings timezone regardless of whether DST is in effect. #SetEnv TZ :/etc/localtime <IfModule mod_quotatab.c> QuotaEngine off </IfModule> <IfModule mod_ratio.c> Ratios off </IfModule> # Delay engine reduces impact of the so-called Timing Attack described in # http://www.securityfocus.com/bid/11430/discuss # It is on by default. <IfModule mod_delay.c> DelayEngine on </IfModule> <IfModule mod_ctrls.c> ControlsEngine off ControlsMaxClients 2 ControlsLog /var/log/proftpd/controls.log ControlsInterval 5 ControlsSocket /var/run/proftpd/proftpd.sock </IfModule> <IfModule mod_ctrls_admin.c> AdminControlsEngine off </IfModule> # # Alternative authentication frameworks # #Include /etc/proftpd/ldap.conf #Include /etc/proftpd/sql.conf # # This is used for FTPS connections # #Include /etc/proftpd/tls.conf # # Useful to keep VirtualHost/VirtualRoot directives separated # #Include /etc/proftpd/virtuals.con # A basic anonymous configuration, no upload directories. # <Anonymous ~ftp> # User ftp # Group nogroup # # We want clients to be able to login with "anonymous" as well as "ftp" # UserAlias anonymous ftp # # Cosmetic changes, all files belongs to ftp user # DirFakeUser on ftp # DirFakeGroup on ftp # # RequireValidShell off # # # Limit the maximum number of anonymous logins # MaxClients 10 # # # We want 'welcome.msg' displayed at login, and '.message' displayed # # in each newly chdired directory. # DisplayLogin welcome.msg # DisplayChdir .message # # # Limit WRITE everywhere in the anonymous chroot # <Directory *> # <Limit WRITE> # DenyAll # </Limit> # </Directory> # # # Uncomment this if you're brave. # # <Directory incoming> # # # Umask 022 is a good standard umask to prevent new files and dirs # # # (second parm) from being group and world writable. # # Umask 022 022 # # <Limit READ WRITE> # # DenyAll # # </Limit> # # <Limit STOR> # # AllowAll # # </Limit> # # </Directory> # # </Anonymous> # Include other custom configuration files Include /etc/proftpd/conf.d/ UseReverseDNS off <Global> RootLogin on UseFtpUsers on ServerIdent on DefaultChdir /var/www DeleteAbortedStores on LoginPasswordPrompt on AccessGrantMsg "You have been authenticated successfully." </Global> Any idea what could be wrong? Thanks for your help!


  • How to get output from Expect

    - by Mallikarjunarao
    I wrote a script for spawning the bc command:
    package require Expect
    proc bc {eq} {
        spawn e:/GnuWin32/bc/bin/bc
        send "$eq\r"
        expect -re "(.*)\r"
        return "$expect_out(0,string)"
    }
    set foo "9487294387234/sqrt(394872394879847293847)"
    puts "the valule [bc $foo]"
    How do I get the output from this? When I run it I get output like this:
    bc 1.06
    Copyright 1991-1994, 1997, 1998, 2000 Free Software Foundation, Inc.
    This is free software with ABSOLUTELY NO WARRANTY.
    For details type `warranty'.
    9487294387234/sqrt(394872394879847293847)
    477
    can't read "expect_out(0,string)": no such element in array
        while executing
    "return "The values is $expect_out(0,string)""
        (procedure "bc" line 6)
        invoked from within
    "bc $foo"
        invoked from within
    "puts "the valule [bc $foo]""
        (file "bc.tcl" line 21)
    How do I resolve this?


  • Google Sitemap and Robots.txt Issue

    - by Sarfaraz Soomro
    Hi, we have a sitemap at our site, http://www.gamezebo.com/sitemap.xml. Some of the URLs in the sitemap are being reported in Webmaster Central as being blocked by our robots.txt (see gamezebo.com/robots.txt), although these URLs are not disallowed in robots.txt. There are other such URLs as well; for example, gamezebo.com/gamelinks is present in our sitemap, but it's being reported as "URL restricted by robots.txt". Also, I have this parse result in Webmaster Central that says, "Line 21: Crawl-delay: 10 Rule ignored by Googlebot". What does it mean? I appreciate your help, thanks.

