Search Results

Search found 6172 results on 247 pages for 'limit choices to'.


  • MySQL Memory Limit Windows Server 2003

    - by Matt
    I am running MySQL 5.0.51a on Windows Server 2003 Standard Edition on an HP DL580 G4 with 3GB installed. One of my database tables has grown to 5.3 GB, with an index file of 2.5 GB, which I believe is causing MySQL to be slow because it constantly has to load and unload the index file when the table is updated. The server itself seems to be performing OK, because MySQL is only using about 500MB of memory (there are other apps running on the system, but MySQL uses the most memory). The table is fairly active, with new records being added all through the day, but there are no deletes, ever. The MySQL server allows up to 600 connections, but only a small number (10 or 20) would actually be writing to this table. I increased the memory limits in MySQL, but since the connection limit is so high I don't think I can give each connection 1GB without risking a problem. Is there some tuning that would let just certain connections get a lot of memory? So I have started to look for alternatives to avert the crisis I know is coming soon. Some of the options I have:

    - Upgrade to Server 2003 Enterprise to install 64GB of memory. Question: would 32-bit MySQL be able to access more than 2GB? Would that be 2GB per thread? That would still be smaller than the index file, so it might not solve the problem completely, but it would be better than now.
    - Upgrade to Server 200x 64-bit and MySQL 64-bit.
    - Switch to a *nix 64-bit server.

    If anybody has suggestions for things to do in the meantime, opinions on which way to go, or other things that I have overlooked, I would appreciate the help. Thanks
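
    A hedged tuning sketch, for context: if the big table is MyISAM (a separate 2.5 GB index file suggests it is), the index cache (key_buffer_size) is one server-wide buffer shared by every connection, unlike the per-connection sort/read buffers, so raising it does not hand each of the 600 connections its own gigabyte. Illustrative my.ini values only; note a 32-bit Windows process normally tops out near 2GB in total:

    [code]
    [mysqld]
    # Global MyISAM index cache: one buffer for the whole server,
    # not multiplied by max_connections.
    key_buffer_size=768M
    # Per-connection buffers: keep these small with max_connections=600.
    sort_buffer_size=256K
    read_buffer_size=128K
    [/code]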

  • Limit NFS block size from server side?

    - by paulw1128
    Is it possible to enforce a maximum rsize/wsize in nfsd? I'm having issues related to IP fragmentation (yes, I'm stuck with NFS-over-UDP, contrary to the warnings in the manpage), and I have no practical access to the client mount command (it is buried in one of many TFTP boot images). http://nfs.sourceforge.net/nfs-howto/ar01s05.html lists a kernel source parameter limiting the maximum block size, but I'm not going to get away with recompiling the nfsd kernel module, so that's not really an option either :-(
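
    One possible avenue, offered as a hedged sketch rather than a confirmed fix: newer kernels expose the nfsd block-size cap at runtime, so if the server kernel is recent enough it can be lowered without recompiling (value and service name are illustrative):

    [code]
    # Sketch only: assumes a kernel that provides /proc/fs/nfsd/max_block_size;
    # the setting can only be changed while no nfsd threads are running.
    /etc/init.d/nfs-kernel-server stop
    echo 8192 > /proc/fs/nfsd/max_block_size
    /etc/init.d/nfs-kernel-server start
    [/code]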

  • Host data transfer limit calculations and network protocol headers

    - by UpTheCreek
    OK, this might be a really stupid question, but... I'm building a web app that utilises websockets. There's fairly rapid messaging going on, so I've been taking a look at the network traffic with wireshark, to see if there's any way of reducing the amount of data we are sending over the wire, and hence costs. A typical message has an approx 150 byte data payload, and according to wireshark the lower layers take up about:

    - Ethernet: 14 bytes
    - IP: 20 bytes
    - TCP: 20 bytes

    My question is: are these network headers included in data transfer calculations? What about TCP ACK messages? (another 54 bytes according to wireshark) This may seem petty, but because we have so much messaging going on, and because the payload is a similar size to these headers, it's significant.
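
    For scale, a back-of-the-envelope calculation from the figures above: each data segment carries 14 + 20 + 20 = 54 bytes of headers, so a 150-byte payload becomes 204 bytes on the wire, roughly 26% overhead. If each data segment also elicits a standalone 54-byte ACK, that is 258 bytes moved per 150-byte message, about 42% overhead. (Whether a given provider's transfer metering counts those header bytes varies, so it is worth confirming with the host.)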

  • PHP CPU utilization limit

    - by knightrider
    I have done some research on the net regarding the problem. My question is NOT how to reduce CPU utilization by improving the algorithm, improving performance through multitasking, or limiting CPU per system user. I have a website where a user logs in, does some processing, and logs out. The site runs on a Linux server with PHP and Apache. The problem is that I can't control the amount of CPU allocated to each user, i.e. I want to guarantee that a user will get, say, at least 5% of the CPU (assume the total number of users is less than 20). How can I do this? Any solution (PHP code, Apache server settings, or anything out of the box) is welcome. Thank you very much for reading this :)
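
    One avenue in the "out of the box" category, offered as an untested sketch (group names and the share value are illustrative, and tying web sessions to specific worker processes is the hard part): Linux cgroups can give a process group a guaranteed relative share of CPU rather than a hard cap.

    [code]
    # Hypothetical sketch using the cgroup cpu controller (paths vary by distro).
    mount -t cgroup -o cpu none /cgroup      # once, if nothing is mounted yet
    mkdir /cgroup/user42                     # one group per logged-in user
    echo 512 > /cgroup/user42/cpu.shares     # relative weight vs. other groups
    echo 12345 > /cgroup/user42/tasks        # move the user's worker PID in
    [/code]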

  • Load balanced proxies to avoid an API request limit

    - by ClickClickClick
    There is a certain API out there which limits the number of requests per day per IP. My plan is to create a bunch of EC2 instances with elastic IPs to sidestep the limitation. I'm familiar with EC2 and am just interested in the configuration of the proxies and a software load balancer. I think I want to run a simple TCP proxy on each instance and a software load balancer on the machine I will be requesting from; something that allows the following to return a response from a different IP each time (round robin, availability, doesn't really matter):

    curl http://www.bbc.co.uk -x http://myproxyloadbalancer:port

    Could anyone recommend a combination of software, or even a link to an article, that details a pleasing way to pull it off? (My client won't be curl, but it is proxy aware; I'll be making the requests from a Ruby script.)
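
    As a hedged illustration of the load-balancer half (hostnames and ports are invented, and it assumes each EC2 instance runs an ordinary HTTP forward proxy such as Squid or tinyproxy rather than a bare TCP proxy), an HAProxy listener can round-robin the pool:

    [code]
    # haproxy.cfg sketch: point the client's proxy setting at this listener.
    listen proxypool
        bind 0.0.0.0:8080
        mode http
        balance roundrobin
        server proxy1 ec2-proxy-1.example.com:3128 check
        server proxy2 ec2-proxy-2.example.com:3128 check
    [/code]

    Because the client is proxy-aware it keeps sending absolute-URI requests; HAProxy only picks which upstream proxy sees each one.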

  • Limit access on login and IP in .htaccess

    - by Rob
    There are many examples showing how to use .htaccess to restrict users by login or by their IP address (i.e. access is allowed from the listed addresses without authorisation). For some reason the following is never mentioned, yet it seems quite useful. How do I restrict using both groups and IP, e.g.:

    - group1 can access the page from anywhere
    - group2 can access the page only from certain IP addresses
    - if you are not logged on, you can't see the page, regardless of where you are

    I would like to have all three of these working at the same time.
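
    A hedged sketch of one way to express all three rules, assuming Apache 2.4's mod_authz_core (file paths, group names, and the network are placeholders):

    [code]
    AuthType Basic
    AuthName "Restricted"
    AuthUserFile /path/to/.htpasswd
    AuthGroupFile /path/to/.htgroup
    <RequireAny>
        # group1: allowed from anywhere once authenticated
        Require group group1
        # group2: must be authenticated AND on the listed network
        <RequireAll>
            Require group group2
            Require ip 192.0.2.0/24
        </RequireAll>
    </RequireAny>
    [/code]

    Anonymous visitors fail both branches, since each branch requires a group, which covers the third rule.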

  • Youtube API - How to limit results for pagination?

    - by worchyld
    I want to grab a user's uploads (e.g. BBC) and limit the output to 10 per page. I can use the following URL, and it works okay:

    http://gdata.youtube.com/feeds/api/users/bbc/uploads/?start-index=1&max-results=10

    However, I want to use the query method instead. The Zend Framework docs (http://framework.zend.com/manual/en/zend.gdata.youtube.html) state that I can retrieve videos uploaded by a user, but ideally I want to use the query method to limit the results for pagination. The query method is in the Zend Framework docs (same page as before, under the title 'Searching for videos by metadata') and is similar to this:

    [code]
    $yt = new Zend_Gdata_YouTube();
    $query = $yt->newVideoQuery();
    $query->setTime('today');
    $query->setMaxResults(10);
    $videoFeed = $yt->getUserUploads(NULL, $query);
    // Output
    print '';
    foreach ($videoFeed as $video):
        print '' . $video->title . '';
    endforeach;
    print '';
    [/code]

    The problem is I can't do $query->setUser('bbc'). I tried setAuthor, but this returns a totally different result. Ideally, I want to use the query method to grab the results in a paginated fashion. How do I use the $query method to set my limits for pagination? Thanks.
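
    One hedged possibility, not verified against the Zend docs: newVideoQuery() accepts an optional feed URL, so the query can be aimed at the user's uploads feed and paginated with the start-index/max-results setters inherited from Zend_Gdata_Query:

    [code]
    $yt = new Zend_Gdata_YouTube();
    // Hypothetical usage: point the query at the uploads feed directly.
    $query = $yt->newVideoQuery('http://gdata.youtube.com/feeds/api/users/bbc/uploads');
    $query->setMaxResults(10);  // page size
    $query->setStartIndex(11);  // 1-based: 1 = page 1, 11 = page 2, ...
    $videoFeed = $yt->getVideoFeed($query->getQueryUrl());
    [/code]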

  • How to get around batch file processing limit

    - by Patrick Cuff
    I have a Windows batch file that processes all the files in a given directory. I have 206,783 files I need to process:

    [code]
    for %%f in (*.xml) do call :PROCESS %%f
    goto :STOP

    :PROCESS
    :: do something with the file
    program.exe %1 > %1.new
    set /a COUNTER=%COUNTER%+1
    goto :EOF

    :STOP
    @echo %COUNTER% files processed
    [/code]

    When I run the batch file, the following output is written:

    65535 files processed

    As part of the processing, an output file is created for each file processed, with a .new extension. When I do a dir *.new it reports that 65,535 files exist. So it appears my command environment has a hard limit on the number of files it can recognize, and that limit is 64K - 1. Is there a way to extend the command environment to manage more than 64K - 1 files? If not, would a VBScript or JavaScript be able to process all 206,783 files? I'm running on Windows 2003 Server, Enterprise Edition, 32-bit.

    UPDATE: It looks like the root cause of my issue was the built-in Windows "extract" command for ZIP files. The files I have to process were copied from another system via a ZIP file. My server doesn't have a ZIP utility installed, just the native Windows commands. I right-clicked on the ZIP file and did an "Extract all...", which apparently extracted only the first 65,535 files. I downloaded and installed 7-Zip onto my server, unzipped all the files, and my batch script worked as intended.

  • How to limit the number of rows per page in a PDF

    - by udaya
    Hi, I am exporting data from a PHP page to a PDF. There I get an unlimited number of rows on each page. How do I set the maximum number of rows that a page can contain? I want only 20 rows on a single page. This is the code I use to export the data to PDF (in mysql_table.php the table for the PDF document is generated):

    [code]
    <?php
    require('mysql_table.php');

    class PDF extends PDF_MySQL_Table
    {
        function Header()
        {
            // Title
            $this->SetFont('Arial', '', 18);
            $this->Cell(0, 6, 'Country details', 0, 1, 'C');
            $this->Ln(10);
            parent::Header();
        }
    }

    // Connect to database
    mysql_connect('localhost', 'root', '');
    mysql_select_db('cms');

    $pdf = new PDF();
    $pdf->AddPage();
    // First table: put all columns automatically
    $pdf->Table("SELECT (SELECT COUNT(*) FROM tblentercountry t2 WHERE t2.dbName <= t1.dbName and dbIsDelete='0') AS SLNO, dbName as Namee, t3.dbCountry as Country, t4.dbState as State, t5.dbTown as Town FROM tblentercountry t1 join tablecountry as t3, tablestate as t4, tabletown as t5 where t1.dbIsDelete='0' and t1.dbCountryId=t3.dbCountryId and t1.dbStateId=t4.dbStateId and t1.dbTownId=t5.dbTownId order by dbName");

    $pdf->AddPage();
    // Second table: specify 3 columns
    $pdf->AddCol('rank', 20, '', 'C');
    $pdf->AddCol('name', 20, 'tablecountry');
    $pdf->AddCol('pop', 20, 'Pop (2001)', 'R');
    $prop = array(
        'HeaderColor' => array(255, 150, 100),
        'color1'      => array(210, 245, 255),
        'color2'      => array(255, 255, 210),
        'padding'     => 2
    );
    //$pdf->Table('select dbCountry,dbCountryId from tablecountry limit 0,10', $prop);
    $pdf->Output();
    ?>
    [/code]

    How do I limit the number of rows on a page?
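
    If the goal is simply 20 rows per page, one hedged approach, reusing only the calls already shown above ($total and the query are placeholders), is to slice the result set with LIMIT and start a new page for each slice:

    [code]
    $perPage = 20;
    $total = 200; // e.g. fetched beforehand with SELECT COUNT(*)
    for ($offset = 0; $offset < $total; $offset += $perPage) {
        $pdf->AddPage();
        $pdf->Table("SELECT dbCountry, dbCountryId FROM tablecountry LIMIT $offset, $perPage", $prop);
    }
    [/code]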

  • Setting minimum size limit for a window in java swing

    - by shadyabhi
    I have a JFrame which has 3 JPanels in a GridBagLayout. Now, when I shrink the window beyond a certain limit, the third JPanel tends to disappear. I tried setting a minimum size for the JFrame using setMinimumSize(new Dimension(int, int)), but with no success: the window can still be resized smaller. So I actually want to set a threshold, so that my window cannot be shrunk beyond a certain limit. How can I do so? Code:

    [code]
    import java.awt.Dimension;
    import javax.swing.JFrame;

    public class JFrameExample {
        public static void main(String[] args) {
            JFrame frame = new JFrame("Hello World");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setMinimumSize(new Dimension(400, 400));
            frame.setVisible(true);
        }
    }
    [/code]

    Also, java --showversion gives me output like:

    shadyabhi@shadyabhi-desktop:~/java$ java --showversion
    java version "1.5.0"
    gij (GNU libgcj) version 4.4.1
    Copyright (C) 2007 Free Software Foundation, Inc.
    This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

    Usage: gij [OPTION] ... CLASS [ARGS] ... to invoke CLASS.main, or
           gij -jar [OPTION] ... JARFILE [ARGS] ... to execute a jar file
    Try `gij --help' for more information.
    shadyabhi@shadyabhi-desktop:~/java$

  • MySQL query optimization - distinct, order by and limit

    - by Manuel Darveau
    I am trying to optimize the following query:

    [code]
    select distinct this_.id as y0_
    from Rental this_
    left outer join RentalRequest rentalrequ1_ on this_.id=rentalrequ1_.rental_id
    left outer join RentalSegment rentalsegm2_ on rentalrequ1_.id=rentalsegm2_.rentalRequest_id
    where this_.DTYPE='B'
      and this_.id<=1848978
      and this_.billingStatus=1
      and rentalsegm2_.endDate between 1273631699529 and 1274927699529
    order by rentalsegm2_.id asc
    limit 0, 100;
    [/code]

    This query is run multiple times in a row for paginated processing of records (with a different limit each time). It returns the ids I need for the processing. My problem is that this query takes more than 3 seconds. I have about 2 million rows in each of the three tables. Explain gives:

    [code]
    +----+-------------+--------------+--------+-----------------------------------------------------+---------------+---------+--------------------------------------------+--------+----------------------------------------------+
    | id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra |
    +----+-------------+--------------+--------+-----------------------------------------------------+---------------+---------+--------------------------------------------+--------+----------------------------------------------+
    | 1 | SIMPLE | rentalsegm2_ | range | index_endDate,fk_rentalRequest_id_BikeRentalSegment | index_endDate | 9 | NULL | 449904 | Using where; Using temporary; Using filesort |
    | 1 | SIMPLE | rentalrequ1_ | eq_ref | PRIMARY,fk_rental_id_BikeRentalRequest | PRIMARY | 8 | solscsm_main.rentalsegm2_.rentalRequest_id | 1 | Using where |
    | 1 | SIMPLE | this_ | eq_ref | PRIMARY,index_billingStatus | PRIMARY | 8 | solscsm_main.rentalrequ1_.rental_id | 1 | Using where |
    +----+-------------+--------------+--------+-----------------------------------------------------+---------------+---------+--------------------------------------------+--------+----------------------------------------------+
    [/code]

    I tried removing the distinct and the query ran three times faster. Explain without the distinct gives:

    [code]
    +----+-------------+--------------+--------+-----------------------------------------------------+---------------+---------+--------------------------------------------+--------+-----------------------------+
    | id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra |
    +----+-------------+--------------+--------+-----------------------------------------------------+---------------+---------+--------------------------------------------+--------+-----------------------------+
    | 1 | SIMPLE | rentalsegm2_ | range | index_endDate,fk_rentalRequest_id_BikeRentalSegment | index_endDate | 9 | NULL | 451972 | Using where; Using filesort |
    | 1 | SIMPLE | rentalrequ1_ | eq_ref | PRIMARY,fk_rental_id_BikeRentalRequest | PRIMARY | 8 | solscsm_main.rentalsegm2_.rentalRequest_id | 1 | Using where |
    | 1 | SIMPLE | this_ | eq_ref | PRIMARY,index_billingStatus | PRIMARY | 8 | solscsm_main.rentalrequ1_.rental_id | 1 | Using where |
    +----+-------------+--------------+--------+-----------------------------------------------------+---------------+---------+--------------------------------------------+--------+-----------------------------+
    [/code]

    As you can see, the Using temporary is added when using distinct. I already have an index on all fields used in the where clause. Is there anything I can do to optimize this query? Thank you very much!

  • GoogleAuthUtil: Daily Limit for Unauthenticated Use Exceeded

    - by Copa
    I am using the Google Client API and the GoogleAuthUtil class to get access to the user's Google Drive account:

    [code]
    String scope = "oauth2:" + DriveScopes.DRIVE;
    String token = GoogleAuthUtil.getToken(getContext(), account.name, scope);
    [/code]

    This is the whole magic. It worked all day, but for the past couple of hours I have been receiving the following message when sending API calls:

    [code]
    com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
    {
      "code": 403,
      "errors": [
        {
          "domain": "usageLimits",
          "message": "Daily Limit for Unauthenticated Use Exceeded. Continued use requires signup.",
          "reason": "dailyLimitExceededUnreg",
          "extendedHelp": "https://code.google.com/apis/console"
        }
      ],
      "message": "Daily Limit for Unauthenticated Use Exceeded. Continued use requires signup."
    }
    [/code]

    I don't know how to use an API key from the console instead of OAuth2 authentication. There are two different getToken() methods. One has four parameters, and the description for the last one says: "extras: Bundle containing additional information that may be relevant to the authentication scope." But what should that information look like? What information do I have to put in the Bundle?

  • MYSQL: Limit Word Length for MySql Insert

    - by elmaso
    Hi, every search query is saved in my database, but I want to limit the character length of a single word: odisafuoiwerjsdkle is too long, so don't write it to the database. My actual code is:

    [code]
    $search = $_GET['q'];
    if (!($sql = mysql_query('SELECT * FROM `history` WHERE `Query`=\'' . $search . '\''))) {
        exit('<b>SQL ERROR:</b> 102, Cannot write history.');
    }
    while ($row = mysql_fetch_array($sql)) {
        $ID = $row['ID'];
    }
    if ($ID == '') {
        mysql_query('INSERT INTO history (Query) values (\'' . $search . '\')');
    }
    if (!($sql = mysql_query('SELECT * FROM `history` ORDER BY `ID` ASC LIMIT 1'))) {
        exit('<b>SQL ERROR:</b> 102, Cannot write history.');
    }
    while ($row = mysql_fetch_array($sql)) {
        $first_id = $row['ID'];
    }
    if (!($sql = mysql_query('SELECT * FROM `history`'))) {
        exit('<b>SQL ERROR:</b> 102, Cannot write history.');
    }
    [/code]
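
    A minimal sketch of the length check itself (the 18-character cap is illustrative, and the escaping step is an added precaution rather than part of the original code):

    [code]
    $maxLen = 18;
    $search = $_GET['q'];
    if (strlen($search) <= $maxLen) {
        // Words longer than $maxLen are simply never written.
        $safe = mysql_real_escape_string($search);
        mysql_query("INSERT INTO history (Query) VALUES ('$safe')");
    }
    [/code]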

  • XPath: limit scope of result set

    - by Laramie
    Given the XML

    [code]
    <a>
      <c>
        <b id="1" value="noob"/>
      </c>
      <b id="2" value="tube"/>
      <a>
        <c>
          <b id="3" value="foo"/>
        </c>
        <b id="4" value="goo"/>
        <b id="5" value="noob"/>
        <a>
          <b id="6" value="near"/>
          <b id="7" value="bar"/>
        </a>
      </a>
    </a>
    [/code]

    and the XPath 1.0 query

    //b[@id=2]/ancestor::a[1]//b[@value="noob"]

    is there some way to limit the result set to the <b> elements that are ONLY the children of the immediate <a> element of the start node (//b[@id=2])? For example, the XPath above returns both node ids 1 and 5. The goal is to limit the result to just node id 1, since it is the only @value="noob" element in the same <c> group as our start node (//b[@id=2]). In English: "Starting at a node whose id is equal to 2, find all the elements whose value is 'noob' that are descendants of the immediate parent c element, without passing through another c element".
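
    One hedged candidate, untested beyond reading the sample: because the wanted nodes live under <c> children of the start node's nearest <a>, stepping through that child axis explicitly avoids matches from nested <a> groups:

    //b[@id=2]/ancestor::a[1]/c//b[@value="noob"]

    Against the document above this selects only id 1, since the deeper <c> elements belong to the nested <a> and are never reached through the direct c child step.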

  • Is there any danger in committing to a component library such as SmartGwt or Swing?

    - by Banang
    Since February this year I have been working on an app that's built using SmartGWT components. Generally, I find the components very nice to work with, and the fact that they're open source and free to use is just fantastic. However, I can't seem to shake the feeling that it's not a durable way of developing, but I can't quite explain why. Maybe it's because I know that any minute now the team developing it could decide to stop, which would leave me and my team in a bit of a pickle, but I'm sure there must be something more. I have been trying to find ways of explaining this feeling to myself, but to no avail. Therefore I turn to you, dear community, to ask if you can come up with a good reason why committing to building your app (that's supposed to be around for many more years to come) using a component library such as SmartGWT is a bad idea? Is there any reason I should just have developed the components myself? Or did I make the right choice when deciding not to reinvent the wheel and just go for what was readily available?

  • How to prevent users from inputting anything other than numbers?

    - by ZaZu
    Hello, I have a simple problem. Here is the code:

    [code]
    #include <stdio.h>

    int main(void)
    {
        int input;
        printf("Choose a numeric value");
        scanf("%d", &input);
        return 0;
    }
    [/code]

    I want the user to only enter numbers, so it has to be something like this:

    [code]
    #include <stdio.h>

    int main(void)
    {
        int input;
        printf("Choose a numeric value");
        do {
            scanf("%d", &input);
        } while (input != 'something');
        return 0;
    }
    [/code]

    My problem is that I don't know what to replace 'something' with. How can I prevent users from inputting alphabetic characters? Thanks for your help!

  • Tomcat - How to limit the maximum memory Tomcat will use

    - by gav
    Hi Guys, I am running Tomcat on a small VPS (256MB/512MB) and I want to explicitly limit the amount of memory Tomcat uses. I understand that I can configure this by passing in the Java maximum and initial heap size arguments:

    -Xmx256m -Xms128m

    But I can't find where to put this in the configuration of Tomcat 6 on Ubuntu. Thanks in advance, Gav
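
    A hedged sketch of the two usual places (paths reflect the stock Ubuntu/Debian package and tarball layouts as best I recall; verify against the local install):

    [code]
    # Ubuntu package install: append to JAVA_OPTS in /etc/default/tomcat6
    JAVA_OPTS="-Djava.awt.headless=true -Xms128m -Xmx256m"

    # Tarball install: $CATALINA_HOME/bin/setenv.sh
    # (create it if absent; catalina.sh sources it automatically)
    JAVA_OPTS="$JAVA_OPTS -Xms128m -Xmx256m"
    [/code]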

  • Web Part Error: This page has exceeded its data fetch limit for connected Web Parts

    - by user348515
    Hi experts, I have a display form with two custom list forms. Both are connected to each other, and they display results according to the filter. But whenever I sort on any field, it gives the following error: "Web Part Error: This page has exceeded its data fetch limit for connected Web Parts. Try disconnecting one or more Web Parts to correct the problem." I appreciate any help. Thanks, SP

  • FCKeditor for ASP.NET: file upload size limit

    - by balanza
    Hi all, I'm working on an ASP.NET website written in VB. I embedded FCKeditor in my page, and it works fine. As it includes an image-upload feature, which also works fine, I need to limit the size of the file before it is uploaded. I'm surprised I couldn't find anything satisfactory on the web; it seems FCKeditor's developers haven't ever thought about that. Has anyone worked around it? Thanks
