Search Results

Search found 9788 results on 392 pages for 'character limit'.

Page 15/392

  • Why is there a limit of max 20 parameters to a clojure function

    - by GuyC
    Hi, there seems to be a limit on the number of parameters a Clojure function can take: when defining a function with more than 20 parameters, I receive a compiler error. Obviously this can be avoided, but I hit the limit while porting the execution model of an existing DSL to Clojure, and I have constructs in my DSL like the following, which can be mapped to functions quite easily by macro expansion, except for this limit:

        (defAlias nn1 ((element ?e1) (element ?e2)) number
          "@doc features of the elements are calculated for entry into the first neural network, the result is the score computed by the latter"
          (nn1-recall (nn1-feature00 ?e1 ?e2)
                      (nn1-feature01 ?e1 ?e2)
                      ...
                      (nn1-feature89 ?e1 ?e2)))

    This is a DSL statement that calls a neural network with 90 input nodes. I can work around it, of course, but I was wondering where the limit comes from. Thanks.
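
    For background: the limit comes from Clojure's function interface, clojure.lang.IFn, which declares invoke() overloads for 0 through 20 positional arguments (the compiler reports "Can't specify more than 20 params"); beyond that, only the variadic path exists. A sketch of one shape the macro expansion could target instead, passing the features as a single vector (the nn1-* names are the question's; the bodies are hypothetical placeholders):

        ;; nn1-recall takes one vector of 90 feature values instead of
        ;; 90 positional parameters, so the 20-argument ceiling never applies.
        (defn nn1-recall [features]
          (reduce + features))                      ; placeholder scoring body

        (defn nn1-score [e1 e2 feature-fns]
          ;; apply each feature function to the element pair, collect a vector
          (nn1-recall (mapv (fn [f] (f e1 e2)) feature-fns)))

    A variadic signature (defn f [& args] ...) also works past the ceiling, and (apply f ninety-args) will funnel any number of arguments into it.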

    Read the article

  • Apply limit in mapreduce function in php?

    - by Rohan Kumar
    How do I apply a limit in PHP/MongoDB when using a mapReduce command? I tried this:

        $cmd = array( // command array
            "mapreduce" => "user",
            "map"       => $map,
            "reduce"    => $reduce,
            "out"       => array("inline" => 1),
            "limit"     => 2
        );
        $db = connect();
        $query = $db->command($cmd); // run the command

    but it doesn't do what I need: it gives me 2 documents, and I can't apply the limit to sub-documents. If I have hundreds of sub-documents and want paging over them, it fails. Is it possible to apply a limit to sub-documents?
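
    Note: in mapReduce, "limit" caps the number of input documents fed to map(), which is why exactly 2 documents come back; it never pages the emitted values. A hedged sketch with the legacy PHP driver: materialize the output to a collection and page that with skip()/limit() (the mr_results name and paging variables are hypothetical). If the real goal is paging over sub-documents, have map() emit one value per sub-document so each becomes its own output document:

        // "limit" caps the *input* documents fed to map(), not the output.
        // To page over results, write them to a collection and page there.
        $cmd = array(
            "mapreduce" => "user",
            "map"       => $map,
            "reduce"    => $reduce,
            "out"       => "mr_results",   // materialize instead of inline
        );
        $db = connect();
        $db->command($cmd);

        // Page over the materialized results with skip()/limit().
        $page    = 3;
        $perPage = 20;
        $cursor  = $db->mr_results->find()
                                  ->skip($page * $perPage)
                                  ->limit($perPage);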

    Read the article

  • Best Workaround with LIMIT subquery MySQL

    - by Hiyasat
    I want to create a stored procedure with multiple statements, but it isn't working. I googled the problem and found that MySQL doesn't support this kind of subquery: "MySQL doesn't yet support 'LIMIT & IN/ALL/ANY/SOME subquery'". My statement looks like this:

        DROP PROCEDURE IF EXISTS proc_Name;
        CREATE PROCEDURE `DBName`.`proc_Name`()
        BEGIN
            SELECT ... FROM table1 WHERE ... ORDER BY table1_Colom LIMIT 100;
            UPDATE table2 SET table2_colom1 = 1
                WHERE ID IN (SELECT ID FROM table2
                             ORDER BY table2_colom1 LIMIT 100);
        END;

    What is the best workaround? Thanks in advance.
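
    The restriction only bites when LIMIT sits directly inside IN/ALL/ANY/SOME; moving the limited query one level down into a derived table is the standard workaround, shown here with the question's own table and column names:

        -- MySQL rejects LIMIT directly inside IN (...), but accepts it
        -- inside a derived table, which the outer IN then reads from.
        UPDATE table2
           SET table2_colom1 = 1
         WHERE ID IN (
               SELECT ID FROM (
                   SELECT ID
                     FROM table2
                    ORDER BY table2_colom1
                    LIMIT 100
               ) AS first_hundred
         );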

    Read the article

  • Is there a list of language only character regions for UTF-8 somewhere?

    - by Brehtt
    I'm trying to analyze some UTF-8 encoded documents in a way that recognizes different languages' characters. For my approach to work I need to ignore non-language characters, such as control characters, mathematical symbols, etc. Just trying to dissect the Basic Latin section of the Unicode standard has resulted in multiple regions, with characters like the division symbol sitting right in the middle of a range of valid Latin characters. Is there a list somewhere that identifies these regions? Or, better yet, a regex that defines the regions, or something in C# that can identify the different kinds of characters?
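
    Unicode's general categories encode exactly this letter/non-letter distinction, and .NET's regex engine exposes them, so no hand-maintained list of ranges is needed. A sketch that keeps letters from any script and drops symbols, digits, and controls:

        // \p{L} matches any Unicode letter in any script; \p{M} keeps the
        // combining marks letters may carry. Everything else (controls,
        // digits, math symbols, punctuation) is stripped.
        using System;
        using System.Text.RegularExpressions;

        class Demo
        {
            static void Main()
            {
                string text = "3 ÷ résumé 日本語 \u0007";
                string lettersOnly = Regex.Replace(text, @"[^\p{L}\p{M}\s]", "");
                Console.WriteLine(lettersOnly);   // letters and whitespace survive
            }
        }

    For per-character tests outside a regex, char.GetUnicodeCategory(c) returns the same category information.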

    Read the article

  • Limit number of views per day in Django

    - by ariddell
    Is there an easy way to limit the number of times a view can be accessed by a given IP address per day or week? Think of a simplified version of the technique used by some booksellers to limit the number of pages of a book you can preview. There's only one view this limit needs to apply to (it's not a general limit), and it would be nice if I could just have a variable overlimit in the template context. The solution need not be terribly robust, but limiting by IP address seemed like a better idea than using a cookie. I've looked into the session middleware, but as far as I can tell it doesn't track IP addresses. Has anyone encountered this problem?
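
    A fixed-window counter keyed on the client IP in Django's cache framework is probably the lightest-weight approach. A sketch for a modern Django, where the view name, limit, and template variable are illustrative (note that REMOTE_ADDR shows the proxy's address when behind one):

        # Per-IP fixed-window counter via Django's cache framework.
        from django.core.cache import cache
        from django.shortcuts import render

        DAILY_LIMIT = 10

        def preview(request):                      # hypothetical view
            ip = request.META.get('REMOTE_ADDR', 'unknown')
            key = 'preview-hits-%s' % ip
            # add() only stores if the key is absent, opening a 24 h window;
            # incr() bumps the counter without resetting the expiry.
            if cache.add(key, 1, 60 * 60 * 24):
                hits = 1
            else:
                hits = cache.incr(key)
            return render(request, 'preview.html',
                          {'overlimit': hits > DAILY_LIMIT})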

    Read the article

  • limit number of characters entered in textarea

    - by Abu Hamzah
    Here is a script that does almost what I want. My question is: how can I stop the user from entering text once it reaches the limit of 255 characters?

        var limit = 255;
        var txt = $('textarea[id$=txtPurpose]');
        $(txt).keyup(function() {
            var len = $(this).val().length;
            if (len > limit) {
                //this.value = this.value.substring(0, 50);
                $(this).addClass('goRed');
                $('#spn').text(len - limit + " characters exceeded");
                return false;
            } else {
                $(this).removeClass('goRed');
                $('#spn').text(limit - len + " characters left");
            }
        });

    If there is a better way, please let me know.
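
    Returning false from a keyup handler is too late to block anything: the character is already in the field. Truncating the value inside the handler actually enforces the cap and covers pasting too. A sketch reusing the question's selector and counter element (.on() needs jQuery 1.7+; older IE ignores the input event, which the other two events cover):

        var limit = 255;
        $('textarea[id$=txtPurpose]').on('keyup input paste', function () {
            var val = $(this).val();
            if (val.length > limit) {
                // Hard-cut anything past the cap, so typing and pasting
                // beyond 255 characters is effectively blocked.
                $(this).val(val.substring(0, limit));
            }
            $('#spn').text(limit - $(this).val().length + " characters left");
        });

    Where only the cap matters, the HTML5 maxlength attribute on the textarea enforces it with no script at all, in browsers that support it.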

    Read the article

  • How to correct character encoding in IE8 native json ?

    - by mike_t2e
    I am using JSON with Unicode text, and I'm having a problem with the IE8 native JSON implementation.

        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
        <script>
            var stringified = JSON.stringify("สวัสดี olé");
            alert(stringified);
        </script>

    Using json2.js or Firefox's native JSON, the alerted string is the same as the original. IE8, on the other hand, returns Unicode escape values rather than the original text: \u0e2a\u0e27\u0e31\u0e2a\u0e14\u0e35 ol\u00e9. Is there an easy way to make IE behave like the others, or to convert this string back to how it should look? And would you regard this as a bug in IE? I thought native JSON implementations were supposed to be drop-in identical replacements for json2.js.
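
    Both outputs are valid JSON encodings of the same string: the format allows any character to be written as a \uXXXX escape, so every conforming parser recovers identical text. A quick check:

        // The \uXXXX form is legal JSON for the same string, so a
        // round-trip through JSON.parse restores the original exactly.
        var original    = "สวัสดี olé";
        var stringified = JSON.stringify(original);   // escaped form in IE8
        var roundTrip   = JSON.parse(stringified);
        alert(roundTrip === original);                // true everywhere

    So it is arguably not a bug, just a different serialization choice; if the escaped form is only cosmetically undesirable, parsing it back is the conversion.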

    Read the article

  • Japanese character stored in SQL Server DB using ASP page that assumed it as ISO-8859-1 encoding

    - by Vishal Seth
    We have a legacy ASP-based product that allowed the UI and data languages of user groups to be configured according to their locations, with the CodePage and CharSet of the ASP pages collecting data set accordingly. I've noticed a few instances in the SQL Server database where users posted Japanese characters into an ASP page that assumed the incoming stream was ISO-8859-1/Western, and as a result the data in the SQL table has been garbled. While upgrading the client to our new product, I want to back-convert those "garbage" Japanese (in some instances Chinese) characters to their actual form. Can I create a utility ASP page that would go through such data values, "fix" the wrongly-decoded strings, and store everything back as UTF-8 strings? In any case, I don't want to affect the French/Spanish/English characters that might be there as well.
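
    The classic mojibake repair is to reverse the wrong decode: turn each stored string back into the bytes it came from (re-encode as ISO-8859-1, which maps all 256 byte values, so the step is lossless), then decode those bytes with the encoding the user actually submitted. A sketch in C#, e.g. for a one-off ASP.NET utility; the source encoding is an assumption to confirm per row (Shift_JIS here), and if the pages really used Windows-1252 rather than strict ISO-8859-1, substitute that code page:

        using System.Text;

        // Reverse the bad decode: string -> original bytes -> correct decode.
        static string RepairMojibake(string garbled)
        {
            byte[] raw = Encoding.GetEncoding("ISO-8859-1").GetBytes(garbled);
            return Encoding.GetEncoding("Shift_JIS").GetString(raw);  // assumed source
        }

    To protect the French/Spanish/English rows, guard the update: skip pure-ASCII values, and use a DecoderExceptionFallback so byte sequences that are not valid Shift_JIS throw instead of silently producing more garbage.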

    Read the article

  • Raising hard limit on RLIMIT_NOFILE system-wide on Linux

    - by jonswar
    We need to raise RLIMIT_NOFILE when running memcached, as we're hitting the default hard limit (1024). However, raising a hard limit requires root, and for various reasons we don't want to run memcached or its containing shell as root; right now we happily run it as a non-root user. Is there a way to raise the hard limit for RLIMIT_NOFILE system-wide, so that we can continue running memcached as non-root and simply raise the soft limit? This is Red Hat Linux with a 2.6 kernel. Thanks! Jon
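
    On Red Hat systems the usual place for a persistent per-user hard-limit raise is the PAM limits file; a sketch with illustrative user name and values. Caveat: only sessions that pass through pam_limits (login shells, su, sshd) pick this up, so a daemon started straight from an init script may instead need a ulimit -n call in that script while it still runs as root:

        # /etc/security/limits.conf  (read by pam_limits)
        # <domain>   <type>  <item>   <value>
        memcached    soft    nofile   4096
        memcached    hard    nofile   65536

    Once the hard limit is raised this way, the non-root user can lift its own soft limit with ulimit -n, up to the new hard ceiling.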

    Read the article

  • R: How to separate character output in a loop?

    - by John
    I'm blanking on the best way to paste a list of strings together to go into an SQL statement. I'm having trouble with the separator bar | printing at the beginning when I don't want it to:

        foo = "blah"
        paste_all_together = NULL
        for (n in 1:4) {
            paste_together = paste(foo, sep = "")
            paste_all_together = paste(paste_all_together, paste_together, sep = "|")
        }

        > paste_all_together
        [1] "|blah|blah|blah|blah"

    I just want it to print out "blah|blah|blah|blah". Do I need a nested loop, or is there a better iterator in R for doing this? Or perhaps a better way to build SQL statements?
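
    The leading bar appears because the accumulator starts as NULL, and paste() still inserts the separator on the first pass. Building the whole vector first and joining once with collapse avoids the problem entirely:

        # collapse joins the elements of one vector with no leading separator
        foo <- "blah"
        paste(rep(foo, 4), collapse = "|")
        # [1] "blah|blah|blah|blah"

        # the same idiom builds, say, a column list for an SQL statement
        cols <- c("id", "name", "created_at")
        sprintf("SELECT %s FROM my_table", paste(cols, collapse = ", "))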

    Read the article

  • SQL Server 2008 R2 Enterprise database has unexpected 4GB database size limit

    - by Jesse
    I have SQL Server 2008 R2 Enterprise installed on a local Windows 7 x64 workstation. When I create a database on the server, it unexpectedly has a 4GB size limit (database properties in SQL Server Management Studio say size = 3934.38 MB, space available = 47.13 MB). Unfortunately the database needs more than 4GB, and Enterprise is not supposed to have a practical maximum size. I confirmed the database is on the Enterprise server:

        SELECT @@VERSION
        -- Microsoft SQL Server 2008 R2 (RTM) - 10.50.1600.1 (X64)
        --     Apr  2 2010 15:48:46
        --     Copyright (c) Microsoft Corporation
        --     Enterprise Edition (64-bit) on Windows NT 6.1 <X64> (Build 7600: )

    The database file is not set to restrict growth in SQL Server Management Studio, and there is plenty of hard drive space. The database was copied from SQL Express (which has a 4GB limit), but the same occurs with a fresh database creation. I've spent a couple of hours trying to figure this out and Google-searching, to no avail. Any ideas?
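
    Two things worth double-checking that the GUI can obscure: whether the connection really lands on the Enterprise instance (a machine that once ran Express often has several instances side by side), and whether the data file carried a MAXSIZE cap over from its Express origin. A diagnostic sketch with hypothetical database and file names:

        -- Which instance and edition is this session actually on?
        SELECT @@SERVERNAME, SERVERPROPERTY('Edition');

        -- Per-file growth caps: max_size is in 8 KB pages, -1 = unlimited.
        USE MyDb;
        SELECT name, size / 128 AS size_mb, max_size
        FROM sys.database_files;

        -- Remove an inherited cap on the data file, if one shows up.
        ALTER DATABASE MyDb
        MODIFY FILE (NAME = MyDb_data, MAXSIZE = UNLIMITED);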

    Read the article

  • Limit Windows PC Network/Internet Throughput

    - by Jon Cram
    I have a Vista x64 machine on a fairly fast Internet connection, with either buggy drivers for the onboard Ethernet or faulty onboard Ethernet hardware: if I sustain too high a throughput on the Ethernet connection, the network connection within Windows fails and I have to restart the machine to restore connectivity. I don't believe I can fix this issue (I'm erring towards faulty hardware), but I would like to mitigate the effects by limiting my network throughput. I want to download a 5GB file from the Internet (a game install via Steam) and am certain that, as this will take a few hours, I will not be able to complete the download before my network connection within Windows fails. From downloading content through a BitTorrent client, I have found that by limiting the download throughput to around 150 kilobytes per second I can maintain a steady network connection. I can't directly limit the throughput of the download through the Steam client, and would instead like to find out how I can limit the throughput of my Ethernet connection within Windows. Any suggestions on how I can achieve this?

    Read the article

  • limit linux background flush (dirty pages)

    - by korkman
    Background flushing in Linux happens when either too much written data is pending (adjustable via /proc/sys/vm/dirty_background_ratio) or a timeout for pending writes is reached (/proc/sys/vm/dirty_expire_centisecs). Unless another limit is being hit (/proc/sys/vm/dirty_ratio), more written data may be cached; further writes will block. In theory, this should create a background process writing out dirty pages without disturbing other processes. In practice, it disturbs any process doing uncached reading or synchronous writing. Badly. This is because the background flush actually writes at 100% device speed, and any other device request at that time will be delayed (because all queues and write-caches along the way are filled). Is there any way to limit the number of requests per second the flushing process performs, or otherwise effectively prioritize other device I/O?
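
    There is no direct requests-per-second knob for the flusher threads; the usual mitigation is to shrink the writeback window so each flush is small and starts early, trading peak write throughput for latency. Illustrative values:

        # /etc/sysctl.conf -- apply with `sysctl -p`
        vm.dirty_background_ratio = 1      # start background flushing at 1% of RAM
        vm.dirty_ratio = 5                 # block writers beyond 5%, not the default
        vm.dirty_expire_centisecs = 1000   # consider pages flush-worthy after 10 s

    On large-memory machines, the byte-based variants vm.dirty_background_bytes and vm.dirty_bytes give finer control than whole-percent ratios.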

    Read the article

  • Limit on WMIC requests from a Windows Service

    - by Anders
    Hi all, does anyone know if there is a limit on how many wmic requests Windows can handle simultaneously when they originate from a Windows service? The reason I'm asking is that my application fails when too many simultaneous requests have been initiated: I don't get any data back from the application. However, if I compile the Python application and run it as a standalone application, everything works fine. The wmic calls look like this:

        subprocess.Popen("wmic path Win32_PerfFormattedData_PerfOS_Memory get CommittedBytes",
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)

    This makes me wonder: is there a limit on Windows services and what they can perform? If the .exe file can handle all requests, then it must have something to do with the fact that I have compiled it as a Windows service.
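
    One failure mode worth ruling out before blaming a service-specific limit: a Popen with both pipes attached but never read can deadlock once the child fills a pipe buffer, and the timing of that differs under the service control manager. Draining the pipes with communicate() and serializing the calls is a cheap test (a sketch; the helper name is hypothetical):

        import subprocess

        def wmic_value(counter):
            """Run one wmic query and return its stdout, draining both pipes."""
            p = subprocess.Popen(
                "wmic path Win32_PerfFormattedData_PerfOS_Memory get " + counter,
                stdout=subprocess.PIPE, stderr=subprocess.PIPE)
            out, err = p.communicate()   # waits for exit; no pipe-buffer deadlock
            return out

        print(wmic_value("CommittedBytes"))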

    Read the article

  • VirtualBox serial port, limit of two?

    - by Evan Carroll
    Using VirtualBox, is there a limit of two serial ports per VM? I'm currently engaged in a task that requires more than two, but no more than four, ports, and I'm at a loss as to how to get COM3 and COM4 up. How do I do it if the VirtualBox GUI doesn't currently support more than two serial host devices? Can I configure them outside the GUI? Is there a hard cap at two devices? And if so, for the love of god, please entertain me with the reasoning; I know Windows and Linux don't impose such a limit, and it would seem awkward to impose one in the virtualization layer. This is for a cheap dial-in project with proprietary Windows tools. I'm using a Quatech serial multiport adapter to provide the ports to multimodems.
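
    Worth checking the CLI first: recent VirtualBox releases model up to four UARTs per VM even where the GUI exposes only two, so VBoxManage may unlock COM3/COM4. This is an assumption to verify against VBoxManage modifyvm --help for your version; the VM name, host devices, I/O bases, and IRQs below are illustrative (classic COM3/COM4 resources):

        # COM3 at 0x3E8 IRQ 4 and COM4 at 0x2E8 IRQ 3, each passed
        # through to a host serial device (hypothetical device paths).
        VBoxManage modifyvm "DialInVM" --uart3 0x3E8 4 --uartmode3 /dev/ttyUSB2
        VBoxManage modifyvm "DialInVM" --uart4 0x2E8 3 --uartmode4 /dev/ttyUSB3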

    Read the article

  • Limit number of simultaneous connections squid makes to a single server

    - by Ben Voigt
    Note: I am asking about outbound concurrent connection limits, not inbound, which is sufficiently covered on existing questions Modern browsers typically open a large number of simultaneous connections, to take advantage of the fact that TCP fairly shares bandwidth between connections. Of course, this doesn't result in fair sharing between users, so some servers have started penalizing hosts which open too many connections. This limit can be configured client-side (e.g. IE MaxConnectionsPerServer, Firefox network.http.max-connections-per-server), but the method differs for each browser and version, and many users aren't competent to adjust it themselves. So we turn to a squid transparent HTTP proxy for central management of HTTP download. How can the number of simultaneous connections from squid to a remote webserver be limited, so the webserver doesn't perceive it as abuse of concurrent connections? Ideally the limit would be per source address. Squid should accept virtually unlimited concurrent requests from the client browser, and issue them sequentially to the remote server, only N at a time, delaying (but not dropping) the others.

    Read the article

  • rsync server side limit bandwidth/connection

    - by c2h2
    In a VoIP application, I have up to 3000 clients rsyncing audio files daily from their Linux server, which is placed at a data center (10 Mbps inbound/outbound). The server also works as a VoIP SIP server running FreeSWITCH, so low ping latency must be ensured. I would therefore like server-side control of rsync that can:

    1. Limit total outbound bandwidth.
    2. Limit the total number of connections (reject clients at the maximum connection count and let them retry after a specific time frame).
    3. Optional: list/kill individual connections.

    Normally I would use ssh + rsync + PEM keys with some extra options, but the above requirements are not feasible with simple command lines. Can anyone point me in some direction, or show some scripts/tools? I would probably integrate them and release on GitHub. Thanks!
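
    rsync's daemon mode covers the connection cap natively; the total-bandwidth cap and per-connection listing need help from the OS (traffic shaping with tc on the daemon's port, for instance, which also keeps SIP traffic unaffected). A sketch of the daemon side with a hypothetical module name and paths (see man rsyncd.conf):

        # /etc/rsyncd.conf
        [audio]
            path = /srv/audio
            read only = yes
            # Connections beyond the cap are refused with an @ERROR line;
            # the client exits non-zero and can simply retry later.
            max connections = 50
            lock file = /var/run/rsyncd.lock

    Clients then pull from rsync://server/audio/ instead of going over ssh.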

    Read the article

  • Request exceeded the limit of 10

    - by Webnet
    My logs are full of:

        [Tue Jan 11 10:20:45 2011] [error] [client 99.162.115.123] Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace., referer: https://www.domain.com/vehicles/Chevrolet/Uplander/2006

    The problem is that when I enable LogLevel debug we get huge error logs, because all of our traffic is SSL. From what I can tell, the file doesn't record these errors anymore; either that, or they're so buried in SSL logs that I just can't find them. Here's my .htaccess:

        Options -indexes
        RewriteEngine On
        RewriteRule ^battery/([^/]+)$ /browser/product?sku=BATTERY+$1&type=battery
        RewriteRule ^vehicles/([^/]+)/([^/]+)/([^/]+)/product([0-9]+)$ /browser/index.php?make=$1&model=$2&id=$3&%{QUERY_STRING} [L,NC]
        RewriteRule ^vehicles/([^/]+)/([^/]+)/([^/]+)/([0-9]+)$ /browser/product.php?make=$1&model=$2&year=$3&id=$4&%{QUERY_STRING} [L,NC]
        RewriteRule ^vehicles/([^/]+)/([^/]+)/([^/]+)$ /store/product/list.php?make=$1&model=$2&year=$3&%{QUERY_STRING} [L,NC]
        RewriteRule ^vehicles/([^/]+)/([^/]+)$ /vehicle/make/model/year/list.php?make=$1&model=$2&%{QUERY_STRING} [L,NC]
        RewriteRule ^vehicles/([^/]+)$ /vehicle/make/model/list.php?make=$1&%{QUERY_STRING} [L,NC]
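
    In per-directory (.htaccess) context an [L] flag does not end processing for good: the rewritten URL is re-injected and the whole rule set runs again, so any rule whose output still matches a pattern loops until the 10-redirect cap trips. A common guard is to bail out once a rewrite has already happened; a sketch (whether it fits depends on which rule is self-matching):

        RewriteEngine On
        # Stop on the second pass: REDIRECT_STATUS is set once an
        # internal redirect (i.e. a previous rewrite) has occurred.
        RewriteCond %{ENV:REDIRECT_STATUS} !^$
        RewriteRule ^ - [L]

        # ... original rules follow unchanged ...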

    Read the article

  • return first non repeating character in a string

    - by Amm Sokun
    I had to solve a question in which, given a string, I had to return the first non-repeating character present in it. I solved it using a hash table, writing a method that takes a constant reference to the string and returns the first non-repeating character. However, when there is no non-repeating character present in the string, I return -1, and in the main program I check for it as follows:

        char c = firstNonRepeating(word);
        if (static_cast<int>(c) == -1)
            cout << "no non repeating character present\n";
        else
            cout << c << endl;

    Is that the correct way to return -1 when the required character is not present?
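
    Returning char(-1) is fragile: whether the cast comparison works depends on char's signedness, and the sentinel collides with the legitimate byte value 0xFF. Signaling "not found" out of band is safer; a sketch using std::optional (C++17):

        #include <iostream>
        #include <optional>
        #include <string>
        #include <unordered_map>

        std::optional<char> firstNonRepeating(const std::string& word) {
            std::unordered_map<char, int> counts;
            for (char c : word) ++counts[c];
            for (char c : word)
                if (counts[c] == 1) return c;   // first char seen exactly once
            return std::nullopt;                // no such character exists
        }

        int main() {
            if (auto c = firstNonRepeating("swiss"))
                std::cout << *c << '\n';        // prints 'w'
            else
                std::cout << "no non repeating character present\n";
        }

    Before C++17, returning an index instead (std::string::size_type, with std::string::npos for "not found") carries the same information without a sentinel character.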

    Read the article

  • Table character encoding - exception in application

    - by zgnilec
    I have this code:

        CREATE TABLE IF NOT EXISTS Person (
            name varchar(24) ...
        ) CHARACTER SET utf8 COLLATE utf8_polish_ci;

    This works OK in my application, but I read that if someone puts a string in the name field containing a character whose code is greater than 127, the database will use 2 bytes (or more) to store that character. So I thought I would change the character set to utf16:

        CHARACTER SET utf16 COLLATE utf16_polish_ci;

    But now when I run my application, an exception appears: KeyNotFoundException. It appears exactly at these instructions:

        MySqlCommand komenda = baza.Polaczenie.CreateCommand();
        komenda.CommandText = zapytanie;
        MySqlDataReader dr = komenda.ExecuteReader(); // HERE, at the ExecuteReader method
        if (dr.Read()) ...

    1) Has anyone had a similar problem? 2) Any idea how to always use 2 bytes/char in a database field?
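
    For what it's worth, MySQL refuses ucs2, utf16, and utf32 as client (connection) character sets, and Connector/NET can only negotiate charsets the protocol allows, which is the likely trigger for the KeyNotFoundException (an educated guess, not confirmed from the source). Variable-width utf8 storage is normal and rarely worth designing around; a sketch that keeps the Polish collation:

        -- Stay with a UTF-8 family charset for anything the client reads;
        -- utf8mb4 also covers supplementary characters (up to 4 bytes/char).
        CREATE TABLE IF NOT EXISTS Person (
            name VARCHAR(24)
        ) CHARACTER SET utf8mb4 COLLATE utf8mb4_polish_ci;

    If fixed two-byte storage really is a requirement, ucs2 works as a column character set while the connection stays on a client-legal charset such as utf8; the server converts between the two.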

    Read the article

  • How do I read UTF-8 characters via a pointer?

    - by Jen
    Suppose I have UTF-8 content stored in memory; how do I read the characters using a pointer? I presume I need to watch for the 8th bit indicating a multi-byte character, but how exactly do I turn the sequence into a valid Unicode character? Also, is wchar_t the proper type for storing a single Unicode character? This is what I have in mind:

        wchar_t readNextChar (char** p) {
            char ch = *p++;
            if (ch & 128) {
                // This is a multi-byte character, what do I do now?
                // char chNext = *p++;
                // ... but how do I assemble the Unicode character? ...
            }
            ...
        }
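
    A minimal decoder sketch for well-formed input (no validation of ill-formed sequences): the lead byte's high bits give the sequence length, and each continuation byte contributes six payload bits. Two fixes relative to the draft above: with char** p, the draft's *p++ advances the char** itself and yields a char*, not a byte; and wchar_t is only 16 bits on Windows, so it cannot hold code points above U+FFFF, making uint32_t (or C11's char32_t) the safer return type:

        #include <stdint.h>

        /* Decode one code point and advance the caller's cursor.
           Assumes well-formed UTF-8. */
        uint32_t readNextChar(const char **p) {
            const unsigned char *s = (const unsigned char *)*p;
            uint32_t cp;
            int extra;  /* number of continuation bytes */

            if (s[0] < 0x80)      { cp = s[0];        extra = 0; } /* ASCII    */
            else if (s[0] < 0xE0) { cp = s[0] & 0x1F; extra = 1; } /* 110xxxxx */
            else if (s[0] < 0xF0) { cp = s[0] & 0x0F; extra = 2; } /* 1110xxxx */
            else                  { cp = s[0] & 0x07; extra = 3; } /* 11110xxx */

            for (int i = 1; i <= extra; i++)
                cp = (cp << 6) | (s[i] & 0x3F);   /* 10xxxxxx payload bits */

            *p += extra + 1;
            return cp;
        }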

    Read the article
