Search Results

Search found 6172 results on 247 pages for 'limit choices to'.

  • Limit scope of #define

    - by Ujjwal Singh
    What is the correct strategy to limit the scope of a #define and avoid unwarranted token collisions? Consider the following configuration:

    Main.c:

        #include "Utility_1.h"
        #include "Utility_2.h"
        VOID Utility() // Was written without knowing of Utility_1 & Utility_2
        {
            const UINT ZERO = 0; // Collision; the compiler does not indicate what replaced ZERO
        }
        VOID Main() { ... }

    Utility_1.h:

        #define ZERO "Zero"
        #define ONE  "One"
        BOOL Utility_1();

    Utility_2.h:

        #define ZERO '0'
        #define ONE  '1'
        BOOL Utility_2();

    Utility_2.c:

        #include "Utility_2.h"
        BOOL Utility_2()
        {
            // Using ZERO & ONE -- collision: character literal replaced with string
        }

    Edit note: this is meant as a generic question, so do not limit yourself to enum or other defined types; i.e., what to do when I MUST use #define. Please comment on my proposed solution below.
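    (A sketch of the usual workarounds, not from the original post: either prefix each macro with its owning header's name so the tokens cannot collide, or #undef a macro as soon as the lines that need it are past. The names below mirror the question's headers.)

        /* Utility_1.h -- prefix macros with the owning header's name */
        #define UTILITY_1_ZERO "Zero"
        #define UTILITY_1_ONE  "One"

        /* Or, inside a .c file, confine a macro to the region that uses it */
        #define ZERO '0'
        /* ... code that needs ZERO ... */
        #undef ZERO /* past this point the token ZERO is ordinary again */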

  • Most efficient way to limit rows returned from a union query - TSQL

    - by stephen776
    Hey guys... I have a simple stored proc with two queries joined with a UNION:

        SELECT name AS 'result' FROM product WHERE...
        UNION
        SELECT productNum AS 'result' FROM product WHERE...

    I want to limit this to the TOP 10 results. If I put TOP 10 in each separate query I get 20 results total. What is the most efficient way to limit the total results to 10? I don't want to do TOP 5 in each, because I may end up in a situation where I have something like 7 "names" and 3 "productNums".
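    (One common approach, sketched against the column names above: wrap the UNION in a derived table and apply TOP to the outer query.)

        SELECT TOP 10 result
        FROM (
            SELECT name AS result FROM product WHERE...
            UNION
            SELECT productNum AS result FROM product WHERE...
        ) AS combined
        ORDER BY result; -- without an ORDER BY, TOP returns an arbitrary 10 rows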

  • MySQL Limit column value repetition N times

    - by Paper-bat
    Hi all, this is my first question here, so be patient ^^ I'll go directly to the problem. I have two tables, Customer (idCustomer, etc.) and Comment (idCustomer, idComment, etc.), and obviously the two tables are joined together, for example:

        SELECT * FROM Comment AS co
        JOIN Customer AS cu ON cu.idCustomer = co.idCustomer

    With this I select every comment together with its Customer, but now I want to limit the result to a maximum of 2 Comments per Customer. The first thing I tried was 'GROUP BY cu.idCustomer', but that leaves only 1 Comment per Customer, and I want 2. How do I proceed?
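    (A sketch of one pre-8.0 MySQL idiom: a correlated subquery that counts how many of the customer's comments come at or before the current one, assuming idComment order is acceptable as "any 2".)

        SELECT co.*
        FROM Comment AS co
        JOIN Customer AS cu ON cu.idCustomer = co.idCustomer
        WHERE (SELECT COUNT(*)
               FROM Comment AS c2
               WHERE c2.idCustomer = co.idCustomer
                 AND c2.idComment <= co.idComment) <= 2; -- keep each customer's first two comments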

  • How to set per-script upload_max_filesize limit?

    - by Ilya Muromets
    I want to set a larger upload max filesize limit for one particular script (action). I know I could use .htaccess, but the problem is that all actions are dispatched via a single index.php file, which means the .htaccess rules are the same for every action. So is it possible to set a per-script upload_max_filesize for a single PHP script (action)? Hope you will be able to help me. Thanks in advance.
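    (A note on what is possible: upload_max_filesize is PHP_INI_PERDIR, so ini_set() cannot raise it at runtime, and per-directory values only help if the upload action gets its own physical script. One workaround, sketched below with hypothetical action names: set the global limit to the largest value any action needs, then enforce smaller per-action caps yourself in the front controller.)

        <?php
        // index.php front controller -- a sketch, not drop-in code
        $limits = array('upload_avatar' => 2097152,     // 2 MB  (hypothetical action)
                        'upload_video'  => 209715200);  // 200 MB (hypothetical action)
        $action = isset($_GET['action']) ? $_GET['action'] : 'default';
        $cap = isset($limits[$action]) ? $limits[$action] : 1048576; // 1 MB default

        if (isset($_SERVER['CONTENT_LENGTH']) && (int)$_SERVER['CONTENT_LENGTH'] > $cap) {
            header('HTTP/1.1 413 Request Entity Too Large');
            exit('Upload too large for this action.');
        }
        // ...dispatch to the requested action as usual...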

  • C++ Char without Limit

    - by Lienau
    I'm pretty well versed in C#, but I decided it would be a good idea to learn C++ as well. The only thing I can't figure out is chars. I know you can use the string lib, but I also want to figure out chars. I know you can set a char array with a limit like this:

        #include <iostream>
        using namespace std;

        int main()
        {
            char c[128] = "limited to 128";
            cout << c << endl;
            system("pause");
            return 0;
        }

    But how would I make a char without a limit? I've seen chars with *, but I thought that was for pointers. Any help is greatly appreciated.
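    (A sketch of the idiomatic answer: a char* points at a buffer whose size you must still manage yourself, so the usual "no fixed limit" tool is std::string, which grows as needed.)

        #include <iostream>
        #include <string>
        using namespace std;

        int main()
        {
            string s = "as long as you like";  // no size declared up front
            s += ", and it keeps growing";     // storage is reallocated as needed
            cout << s << " (" << s.size() << " chars)" << endl;
            return 0;
        }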

  • Rails - set POST request limit (file upload)

    - by Fabiano PS
    I am building a file uploader for Rails using CarrierWave. I am pretty happy with its API, except that I don't seem to be able to cut off file uploads that exceed a limit on the fly. I found this plugin for validation, but the problem is that it runs after the upload is completed. That is completely unacceptable in my case, as any user could take the site down by uploading a huge file. So I figure the way to go is some Rack configuration or middleware that limits the POST body size as it is received. I am hosting on Heroku, for context. (I am aware of https://github.com/dwilkie/carrierwave_direct but it doesn't solve my issue, as I have to resize first and discard the original large image.)
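    (A sketch of such a middleware -- the class name and cap are hypothetical. Note that on Heroku the routing layer buffers the request before your dyno sees it, so this protects the app process but the bytes still cross the wire.)

        # app/middleware/post_body_limiter.rb
        class PostBodyLimiter
          MAX_BYTES = 10 * 1024 * 1024 # 10 MB -- pick your own cap

          def initialize(app)
            @app = app
          end

          def call(env)
            if env['REQUEST_METHOD'] == 'POST' && env['CONTENT_LENGTH'].to_i > MAX_BYTES
              # reject before the application (and CarrierWave) ever runs
              [413, { 'Content-Type' => 'text/plain' }, ['Request entity too large']]
            else
              @app.call(env)
            end
          end
        end

        # config/application.rb (hypothetical placement):
        # config.middleware.insert_before Rack::Runtime, PostBodyLimiter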

  • How do you set rate limit access to your API using Iptables?

    - by Cory
    How can you rate-limit access to an API using iptables? I tried setting a limit on port 80, but I don't want to limit web access entirely. Is there a way to specify a subdomain rather than a port? Example: rate-limit api.example.com but not example.com. If there is no way to rate-limit by subdomain, what is a suggested rate limit for port 80 that doesn't risk blocking legitimate web users? Would one connection per second be enough?
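    (A note on feasibility: iptables works at layers 3/4 and never sees the Host header, so it cannot tell api.example.com from example.com on the same address; you would either give the API subdomain its own IP, or rate-limit inside the web server. A per-source sketch, assuming the API resolves to its own address 203.0.113.10 -- per-client limits are safer than a global one-connection-per-second cap, which would throttle legitimate users behind shared NAT.)

        # Drop new connections from any single client IP above ~5/second
        iptables -A INPUT -d 203.0.113.10 -p tcp --dport 80 \
                 -m state --state NEW \
                 -m hashlimit --hashlimit-name api --hashlimit-mode srcip \
                 --hashlimit-above 5/second -j DROP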

  • SELECT a list of elements and 5 tags for each one

    - by Vittorio Vittori
    Hi, I'm trying to query a set of buildings listed in a table; the buildings are linked with tags. I'm able to do it, but my problem is how to limit the number of tags returned per building:

        table buildings
        id  building_name  style
        1   Pompidou       bla
        2   Alcatraz       bla
        3   etc.           etc.

        table tags -- there can be 50 or more per building
        id  tag_name
        1   minimal
        2   gothic
        3   classical
        4   modern
        5   etc.

        table buildings_tags
        id  building_id  tag_id

    I thought of something like this to retrieve the list, but it isn't complete:

        SELECT DISTINCT(tag), building_name
        FROM buildings
        INNER JOIN buildings_tags ON buildings.id = buildings_tags.building_id
        INNER JOIN tags ON tags.id = buildings_tags.tag_id
        LIMIT 0, 20

        -- result
        building  tag
        Pompidou  great
        Pompidou  france
        Pompidou  paris
        Pompidou  industrial
        Pompidou  renzo piano   <= How to stop at the 5th result?
        Pompidou  hi-tech
        Pompidou  famous place
        etc.      etc.

    This query loads the buildings, but it loads all the tags linked to each building, not only 5 of them.
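    (The same correlated-subquery idiom sketched under the earlier "repetition N times" question works here, assuming "any 5 tags" in tag_id order is acceptable.)

        SELECT b.building_name, t.tag_name
        FROM buildings b
        INNER JOIN buildings_tags bt ON bt.building_id = b.id
        INNER JOIN tags t ON t.id = bt.tag_id
        WHERE (SELECT COUNT(*)
               FROM buildings_tags bt2
               WHERE bt2.building_id = bt.building_id
                 AND bt2.tag_id <= bt.tag_id) <= 5; -- at most 5 tags per building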

  • Minimum 4 Characters or Show Nothing - PHP

    - by ali
    Hi, I want to show only words with a minimum of 4 characters. Right now I can limit the hits and show only words with min. 100 hits -- but how can I set a minimum character length? Thank you!

        function sm_list_recent_searches($before = '', $after = '', $count = 20) {
            // List the most recent successful searches.
            global $wpdb, $table_prefix;
            $count = intval($count);
            $results = $wpdb->get_results(
                "SELECT `terms`, `datetime` FROM `{$table_prefix}searchmeter_recent`
                 WHERE 100 < `hits` ORDER BY `datetime` DESC LIMIT $count");
            if (count($results)) {
                foreach ($results as $result) {
                    echo '<a href="' . get_settings('home') . '/search/' . urlencode($result->terms) . '">'
                        . htmlspecialchars($result->terms) . '</a>' . ", ";
                }
            }
        }
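    (One way, a sketch: filter in the SQL itself with MySQL's CHAR_LENGTH(), which counts characters rather than bytes.)

        $results = $wpdb->get_results(
            "SELECT `terms`, `datetime` FROM `{$table_prefix}searchmeter_recent`
             WHERE 100 < `hits`
               AND CHAR_LENGTH(`terms`) >= 4  -- skip words shorter than 4 characters
             ORDER BY `datetime` DESC LIMIT $count");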

  • PHP - How to display other values, when a query is limited by 3?

    - by Dodi300
    Hello. Can anyone tell me how to display the remaining values when a query is limited to 3? In this question I asked how to order and limit values, but now I want to show the others in another query. How would I go about doing this? Here's the code I used before:

        $query = "SELECT gmd FROM account ORDER BY gmd DESC LIMIT 3";
        $result = mysql_query($query);
        while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
        }

    Thanks!
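    (A sketch: keep the same ORDER BY and skip the first 3 rows with an offset. The MySQL manual suggests a very large row count when you want "from an offset to the end".)

        -- everything after the top 3
        SELECT gmd FROM account ORDER BY gmd DESC LIMIT 3, 18446744073709551615;
        -- or, if the next 20 are enough
        SELECT gmd FROM account ORDER BY gmd DESC LIMIT 3, 20;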

  • MySQL Query to receive random combinations from two tables.

    - by Michael
    Alright, here is my issue: I have two tables, one named firstnames and the other named lastnames. What I am trying to do is build 100 random combinations of these names for test data. The firstnames table has 5494 entries in a single column, and the lastnames table has 88799 entries in a single column. The only query I have come up with that produces some results is:

        SELECT *
        FROM (SELECT * FROM firstnames ORDER BY RAND()) f
        LEFT JOIN (SELECT * FROM lastnames ORDER BY RAND()) l ON 1=1
        LIMIT 10;

    The problem with this code is that it selects one firstname and pairs it with every lastname that could go with it. While this is plausible, I would have to set the limit to 500000000 to get all the combinations without ending up with only 20 first names (and I'd rather not kill my server). However, I only need 100 randomly generated entries for test data, and I will not be able to get that with this code. Can anyone please give me any advice?
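    (A sketch of one approach: draw 100 random rows from each side, number them with user variables, and join on the row number so each firstname pairs with exactly one lastname. The column names firstname/lastname are assumptions, since the question doesn't give them.)

        SET @f := 0;
        SET @l := 0;
        SELECT f.firstname, l.lastname
        FROM (SELECT @f := @f + 1 AS rn, firstname
              FROM (SELECT firstname FROM firstnames ORDER BY RAND() LIMIT 100) AS rf) AS f
        JOIN (SELECT @l := @l + 1 AS rn, lastname
              FROM (SELECT lastname FROM lastnames ORDER BY RAND() LIMIT 100) AS rl) AS l
          ON f.rn = l.rn;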

  • php-cgi memory usage higher than php's memory limit

    - by Josh Nankin
    I'm running Apache with a worker MPM and PHP via FastCGI. The following are my MPM limits:

        StartServers          5
        MinSpareThreads       5
        MaxSpareThreads      10
        ThreadLimit          64
        ThreadsPerChild      10
        MaxClients           10
        MaxRequestsPerChild 2000

    I've also set up my php-cgi with the following:

        PHP_FCGI_CHILDREN=5
        PHP_FCGI_MAX_REQUESTS=500

    I'm noticing that my average php-cgi process is using around 200+ MB of RAM, even right after it starts. However, my PHP memory_limit is only 128M. How is this possible, and what can I do to lower the php-cgi memory consumption?

  • CopSSH SFTP -- limit users access to their home directory only

    - by bradvido
    Let me preface this by saying I've read and followed the instructions in the FAQ many times: http://www.itefix.no/i2/node/37 -- it does not do what the title claims. It allows every user access to every other user's home directory, as well as access to all subfolders below the CopSSH installation path. I'm only using this for SFTP access, and I need my users to be sandboxed into their home directory only. If you know a fool-proof way to lock users down so they can see only their home directory and its subfolders, stop reading now and reply with the solution.

    The details: here is exactly what I tried as I followed the FAQ. My CopSSH installation directory is C:\Program Files\CopSSH.

        net localgroup sftp_users /ADD                          ** Create a user group to hold all my SFTP users
        cacls c:\ /c /e /t /d sftp_users                        ** For that group, deny access at the top level and all levels below
        cacls "C:\Program Files\CopSSH" /c /e /t /r sftp_users  ** Allow the group access to the CopSSH installation directory and its subdirectories

    For each SFTP user, I create a new Windows user account, then:

        net localgroup sftp_users sftp_user_1 /add              ** Add the user to the group I've created

    Then I open the activate-user wizard for CopSSH, choose the user and "/bin/sftponly", and leave "Remove copssh home directory if it exists", "Create keys for public key authentication", and "Create link to user's real home directory" checked.

    This works; however, every user has access to every other user's home directory as well as the CopSSH root directory. So I tried denying access for all users to the user home directory:

        cacls "C:\Program Files\CopSSH\home" /c /e /t /d sftp_users  ** Deny access for users to the user home directory

    Then I tried adding permissions on a user-by-user basis for each user's home\username folder. However, these permissions were not allowed by Windows, because the deny rule I created at the home directory was being inherited and overriding my allow rule. The next step would be to remove the deny rule at the home directory and, for each user folder, add a deny rule for every user it doesn't belong to and an allow rule for the one user it does belong to. However, as my user list gets long, this will become very cumbersome. Thanks for the help!

  • Windows 7 / ATI CCC - Monitor limit?

    - by james.ingham
    Hi all, I have an ATI 4850 X2, and back in the days when Windows 7 was in beta I had it running with 4 monitors (Crossfire off). Now I have come to set this up again, with an RTM version of Windows 7 and the latest ATI drivers, and it seems I can only have two monitors on at a time. If I try to set the monitors up in Control Panel, it simply disables the non-primary monitor, and if I attempt to do it in CCC, I get the message "To extend the desktop, a desktop or display must be disabled." Does anyone know why this is and/or how to fix it? I've tried both the latest ATI drivers and the drivers (I think) I was using over a year ago when this setup worked. Thanks - James

  • How to change Blackberry initial message download limit.

    - by Mikey B
    BES 4.1.6, BlackBerry 8300 (Curve). Hi guys, I've noticed that handhelds will typically retrieve only the first few KB of a message and then prompt the user to manually retrieve more (or auto-retrieve it if they scroll down). The problem is that I have a BB app that needs to see the entire message at once, the first time it's opened. Is there a setting on BES that will allow me to change how much data a handheld initially retrieves per message? Thanks, M

  • Limit Apache 2 Memory Usage

    - by UltraNurd
    I am running a hobby webserver off of an ancient Blue & White G3/300 running Debian PPC Squeeze 2.6.30. The performance is okay for a while after a restart, but it eventually gets more and more bogged down. Right now it's at 76 days uptime, and the main culprit seems to be the memory usage of 10+ apache2 processes. I think I need to lower the values for StartServers, MinSpareServers, and/or MaxSpareServers, but I'm not sure which ones to adjust, since there are three sections, one per MPM module that might be in use. How do I tell which of the following sections I need to change, and what are some reasonable values given that the box has 448 MB of physical memory (weird upgrade history of one each of 64, 128, and 256 MB sticks)?

        <IfModule mpm_prefork_module>
            StartServers          5
            MinSpareServers       5
            MaxSpareServers      10
            MaxClients          150
            MaxRequestsPerChild   0
        </IfModule>

        <IfModule mpm_worker_module>
            StartServers          2
            MinSpareThreads      25
            MaxSpareThreads      75
            ThreadLimit          64
            ThreadsPerChild      25
            MaxClients          150
            MaxRequestsPerChild   0
        </IfModule>

        <IfModule mpm_event_module>
            StartServers          2
            MaxClients          150
            MinSpareThreads      25
            MaxSpareThreads      75
            ThreadLimit          64
            ThreadsPerChild      25
            MaxRequestsPerChild   0
        </IfModule>

    There aren't any other instances of StartServers in my apache2.conf, but none of those MPM modules appear in mods-available or mods-enabled. Ideas? Thanks!
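    (A note on telling the sections apart: "apache2ctl -V" prints a "Server MPM" line; on Debian the MPM is compiled into the apache2 binary via the apache2-mpm-prefork / -worker / -event packages, which is why nothing shows up in mods-available, and with mod_php installed it is almost certainly prefork. Below is a hedged starting point for 448 MB, assuming each child settles around 20-30 MB resident; measure with top and tune from there. A nonzero MaxRequestsPerChild recycles children so slow leaks cannot accumulate between restarts.)

        <IfModule mpm_prefork_module>
            StartServers          2
            MinSpareServers       2
            MaxSpareServers       4
            MaxClients           10
            MaxRequestsPerChild 500
        </IfModule>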

  • How do I limit the CPU share of TrustedInstaller.exe on a Vista system

    - by Dan Neely
    I'm trying to fix a few low-end single-core desktops running Vista. In normal use they're fast enough not to be a problem. The issue is that these machines are only on when being used, primarily for school work, so whenever Windows Update begins installing patches it launches TrustedInstaller.exe, which in turn hogs 100% of the CPU and renders the machines all but unusable for however long it takes to patch them.

  • Limit vsftpd upload to a given set of file-names

    - by Chen Levy
    I need to configure anonymous FTP with upload. Given this requirement, I am trying to lock the server down to the bare minimum. One of the restrictions I wish to impose is to allow the upload of only a given set of file names. I tried to disallow write permission on the upload folder and put in it some empty files that do have write permission:

        /var/ftp/            [root.root] [drwxr-xr-x]
        |-- upload/          [root.root] [drwxr-xr-x]
        |   |-- upfile1      [ftp.ftp]   [--w-------]
        |   `-- upfile2      [ftp.ftp]   [--w-------]
        `-- download/        [root.root] [drwxr-xr-x]
            `-- ...

    But this approach didn't work: when I tried to upload upfile1, the client tried to delete the file and create a new one in its place, and there is no permission for that. Is there a way to make this work, or perhaps a different approach, like abusing the deny_file option?

  • Setting a time limit for a transaction in MySQL/InnoDB

    - by Trevor Burnham
    This sprang from this related question, where I wanted to know how to force two transactions to occur sequentially in a trivial case (where both are operating on only a single row). I got an answer -- use SELECT ... FOR UPDATE as the first line of both transactions -- but this leads to a problem: if the first transaction is never committed or rolled back, then the second transaction will be blocked indefinitely. The innodb_lock_wait_timeout variable sets the number of seconds after which the client trying to make the second transaction is told "Sorry, try again"... but as far as I can tell, they'd be trying again until the next server reboot. So:

    1. Surely there must be a way to force a ROLLBACK if a transaction is taking forever? Must I resort to a daemon to kill such transactions, and if so, what would such a daemon look like?
    2. If a connection is killed by wait_timeout or interactive_timeout mid-transaction, is the transaction rolled back? Is there a way to test this from the console?

    Clarification: innodb_lock_wait_timeout sets the number of seconds that a transaction will wait for a lock to be released before giving up; what I want is a way of forcing a lock to be released.

    Update: here's a simple example that demonstrates why innodb_lock_wait_timeout is not sufficient to ensure that the second transaction is not blocked by the first:

        START TRANSACTION;
        SELECT SLEEP(55);
        COMMIT;

    With the default setting of innodb_lock_wait_timeout = 50, this transaction completes without errors after 55 seconds. And if you add an UPDATE before the SLEEP line, then initiate a second transaction from another client that tries to SELECT ... FOR UPDATE the same row, it's the second transaction that times out, not the one that fell asleep. What I'm looking for is a way to force an end to this transaction's restful slumber.
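    (A sketch of what such a daemon could poll, assuming MySQL 5.5+ where the INFORMATION_SCHEMA.INNODB_TRX table is available. Killing the connection does make the server roll back its open transaction, which also answers the wait_timeout question.)

        -- transactions open for more than 60 seconds
        SELECT trx_mysql_thread_id, trx_started
        FROM information_schema.innodb_trx
        WHERE trx_started < NOW() - INTERVAL 60 SECOND;

        -- then, for each thread id returned (1234 is a placeholder):
        KILL 1234;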

  • How can I exceed the 60% Memory Limit of IIS7

    - by evilknot
    Pardon if this is more Stack Overflow than Server Fault; it seems to be on the border. We have an application that caches a large amount of product data for an e-commerce application using ASP.NET caching. This is a dictionary object with 65K elements, and our calculations put the object's size at ~10 GB.

    Problem: the amount of memory the object consumes seems to be far in excess of our 10 GB calculation. Biggest concern: we can't seem to use more than 60% of the 32 GB in the server.

    What we've tried so far, in machine.config/system.web (SF doesn't allow the tags, pardon the formatting):

        processModel autoConfig="true" memoryLimit="80"

    And in web.config/system.web/caching/cache:

        privateBytesLimit = "20000000000" (and 0, the default, of course)
        percentagePhysicalMemoryUsedLimit = "90"

    Environment: Windows 2008 R2 x64, 32 GB RAM, IIS7. Nothing seems to allow us to exceed the 60% value. See the attached screenshot of Task Manager.

  • postfix concurrency limit with round robin dns

    - by goose
    Take the following internal round-robin DNS setup:

        mymta.com.    IN    A    172.31.1.1
        mymta.com.    IN    A    172.31.1.2
        mymta.com.    IN    A    172.31.1.3
        mymta.com.    IN    A    172.31.1.4
        mymta.com.    IN    A    172.31.1.5
        mymta.com.    IN    A    172.31.1.6
        mymta.com.    IN    A    172.31.1.7
        mymta.com.    IN    A    172.31.1.8
        mymta.com.    IN    A    172.31.1.9
        mymta.com.    IN    A    172.31.1.10

    Now assume the following Postfix setup (these are the only tweaks from the defaults in the Debian package). main.cf:

        smtp_connection_cache_destinations = mymta.com
        smtp_connection_cache_reuse_limit = 750
        smtp_destination_concurrency_limit = 75

    transport:

        *    :[mymta.com]

    I would expect 75 concurrent connections spread across the 10 A records I've set in DNS. However, I'm seeing more than a few hundred connections to mymta.com, and I'm wondering if Postfix is "smart" enough to set up 75 concurrent connections for each IP address. Thoughts?
