Search Results

Search found 6311 results on 253 pages for 'limit clause'.

  • Does PostgreSQL varchar count Unicode character length or ASCII character length?

    - by bennylope
    I tried importing a database dump from a SQL file, and the insert failed on the string Mér going into a field defined as varying(3). I didn't capture the exact error, but it pointed to that specific value and the varying(3) constraint. Since this was unimportant to what I was doing at the time, I just changed the value to Mer, it worked, and I moved on. Does a varying field's limit take the length of the byte string into account? What really boggles my mind is that this was dumped from another PostgreSQL database, so it doesn't make sense that the constraint allowed the value to be written in the first place.
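
    A quick way to check what the limit actually counts, as a minimal sketch assuming a UTF-8 database: char_length counts characters while octet_length counts bytes, and varchar(n) limits characters, so the 3-character Mér fits even though it is 4 bytes in UTF-8.

        -- Minimal sketch, assuming a UTF-8 database.
        SELECT char_length('Mér') AS characters,  -- 3
               octet_length('Mér') AS bytes;      -- 4

        CREATE TEMP TABLE t (v character varying(3));
        INSERT INTO t VALUES ('Mér');  -- succeeds when the client encoding matches the file

    The usual reason a value that fit in the source database fails on import is a client_encoding mismatch during restore, which makes the accented byte sequence read back as two characters instead of one.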

  • Code sample with buffer overflow (gets). Why does it not behave as expected?

    - by citronas
    This is an extract from a C program that is supposed to demonstrate a buffer overflow. void foo() { char arr[8]; printf(" enter bla bla bla"); gets(arr); printf(" you entered %s\n", arr); } The question was "How many input characters can a user enter at most without creating a buffer overflow?" My initial answer was 8, because the char array is 8 bytes long. Although I was pretty certain my answer was correct, I tried a higher number of characters and found that the limit I can enter before getting a segmentation fault is 11. (I'm running this on a VirtualBox Ubuntu.) So my question is: why is it possible to enter 11 characters into that 8-byte array?

  • Retrieve multiple values from a single-dimension value/key JSON

    - by jonnypixel
    I'm busting my head trying to work this out. "ContentBlock1":["2","22"] I have been trying to get the 2 and the 22 into a comma-separated string so I can use it within a MySQL IN(2,22) query. I have tried several ways, but none seem to work for me. $ContentBlock = my json data; $cid = json_decode($ContentBlock,true); foreach ($cid as $key){ $jsoncid = "$key ,"; } And then: SELECT * FROM content WHERE featured=1 AND state=1 AND catid IN($jsoncid) ORDER BY ordering ASC LIMIT 4");

  • Best place to store large amounts of session data

    - by audiopleb
    I'm building an application that needs to store and re-use large amounts of data per session. For example, the user selects a large list of items (say 2,000 or significantly more) that have a numeric value as their key, saves that selection, goes off to another page, does something else, and then comes back to the original page, which needs to load their selections again. What is the quickest and most efficient way of storing and reusing that data? In a text file saved under the session id? In a temp db table? In the session data itself (db-backed sessions, so size isn't a limit), as a serialised string or using gzcompress or gzencode? Any advice or insight would be great! Thank you!
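
    For the temp db table option mentioned above, a minimal sketch (MySQL syntax; the table and column names are hypothetical): one row per selected item, keyed by the session id, so saving, reloading, and clearing a selection are simple indexed operations.

        -- Hypothetical table for the "temp db table" option.
        CREATE TABLE session_selection (
            session_id VARCHAR(64) NOT NULL,
            item_id    INT         NOT NULL,
            PRIMARY KEY (session_id, item_id)
        );

        -- Save, reload, and clean up one user's selection.
        INSERT INTO session_selection (session_id, item_id) VALUES ('sess-abc123', 42);
        SELECT item_id FROM session_selection WHERE session_id = 'sess-abc123';
        DELETE FROM session_selection WHERE session_id = 'sess-abc123';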

  • MySQL command-line tool: How to find out number of rows affected by a DELETE?

    - by ambivalence
    I'm trying to run a script that deletes a bunch of rows from a MySQL (InnoDB) table in batches, by executing the following in a loop: mysql --user=MyUser --password=MyPassword MyDatabase < SQL_FILE where SQL_FILE contains a DELETE FROM ... LIMIT X command. I need to keep running this loop until there are no more matching rows. But unlike running in the mysql shell, the command above does not return the number of rows affected. I've tried -v and -t, but neither works. How can I find out how many rows the batch script affected? Thanks!
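
    One approach, assuming SQL_FILE can be extended, is to have the file itself select ROW_COUNT() after the DELETE; the batch client prints the result, and with --skip-column-names it prints only the number, so the shell loop can stop when it reads 0. The table name and condition below are placeholders.

        -- Hypothetical contents of SQL_FILE: ROW_COUNT() returns the number of
        -- rows changed by the immediately preceding statement.
        DELETE FROM my_table WHERE some_condition LIMIT 1000;
        SELECT ROW_COUNT() AS rows_deleted;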

  • How do you automatically refresh part of a page using JavaScript or AJAX?

    - by Ryan
    $messages = $db->query("SELECT * FROM chatmessages ORDER BY datetime DESC, displayorderid DESC LIMIT 0,10"); while($message = $db->fetch_array($messages)) { $oldmessages[] = $message['message']; } $oldmessages = array_reverse($oldmessages); ?> <div id="chat"> <?php for ($count = 0; $count < 9; $count++) { echo $oldmessages[$count]; } ?> <script language="javascript" type="text/javascript"> <!-- setInterval( "document.getElementById('chat').innerHTML='<NEW CONTENT OF #CHAT>'", 1000 ); --> </script> </div> I'm trying to create a PHP chatroom script, but I'm having a lot of trouble getting it to auto-refresh. The content should update automatically, but how do you make it do that? I've been searching for almost an hour.

  • Sub-query making the query slow

    - by Muhammad Kashif Nadeem
    Please copy and paste the following script. DECLARE @MainTable TABLE(MainTablePkId int) INSERT INTO @MainTable SELECT 1 INSERT INTO @MainTable SELECT 2 DECLARE @SomeTable TABLE(SomeIdPk int, MainTablePkId int, ViewedTime1 datetime) INSERT INTO @SomeTable SELECT 1, 1, DATEADD(dd, -10, getdate()) INSERT INTO @SomeTable SELECT 2, 1, DATEADD(dd, -9, getdate()) INSERT INTO @SomeTable SELECT 3, 2, DATEADD(dd, -6, getdate()) DECLARE @SomeTableDetail TABLE(DetailIdPk int, SomeIdPk int, Viewed INT, ViewedTimeDetail datetime) INSERT INTO @SomeTableDetail SELECT 1, 1, 1, DATEADD(dd, -7, getdate()) INSERT INTO @SomeTableDetail SELECT 2, 2, NULL, DATEADD(dd, -6, getdate()) INSERT INTO @SomeTableDetail SELECT 3, 2, 2, DATEADD(dd, -8, getdate()) INSERT INTO @SomeTableDetail SELECT 4, 3, 1, DATEADD(dd, -6, getdate()) SELECT m.MainTablePkId, (SELECT COUNT(Viewed) FROM @SomeTableDetail), (SELECT TOP 1 s2.ViewedTimeDetail FROM @SomeTableDetail s2 INNER JOIN @SomeTable s1 ON s2.SomeIdPk = s1.SomeIdPk WHERE s1.MainTablePkId = m.MainTablePkId) FROM @MainTable m The script above is just a sample; I have a long list of columns in the SELECT and around 12+ columns in the sub-query. My FROM clause has around 8 tables. To fetch 2,000 records the full query takes 21 seconds, and if I remove the sub-queries it takes just 4 seconds. I have tried to optimize the query with the Database Engine Tuning Advisor, but adding the advised indexes and statistics made the query time even worse. Note: as mentioned, this is test data to explain my question; the real data has many more tables, joins, and columns, but without the sub-queries the performance is fine. Any help, thanks.
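
    One common rewrite, sketched below against the sample table variables, is to evaluate each correlated subquery once: the overall count becomes a single-row CROSS JOIN, and the per-MainTablePkId value comes from a ranked derived table joined on rn = 1. The ORDER BY inside ROW_NUMBER is an assumption, since the original TOP 1 has no ORDER BY.

        -- Sketch only (SQL Server syntax): the correlated subqueries are
        -- replaced by one aggregate and one ranked derived table.
        SELECT m.MainTablePkId,
               d.TotalViewed,
               x.ViewedTimeDetail
        FROM @MainTable AS m
        CROSS JOIN (SELECT COUNT(Viewed) AS TotalViewed
                    FROM @SomeTableDetail) AS d
        LEFT JOIN (SELECT s1.MainTablePkId,
                          s2.ViewedTimeDetail,
                          ROW_NUMBER() OVER (PARTITION BY s1.MainTablePkId
                                             ORDER BY s2.ViewedTimeDetail DESC) AS rn
                   FROM @SomeTableDetail AS s2
                   INNER JOIN @SomeTable AS s1 ON s2.SomeIdPk = s1.SomeIdPk) AS x
            ON x.MainTablePkId = m.MainTablePkId AND x.rn = 1;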

  • Returning more than 1000 rows in a classic ASP ADODB.Recordset

    - by peg_leg
    My code in classic ASP, doing an MSSQL database query: rs.pagesize = 1000 ' this should enable paging rs.maxrecords = 0 ' 0 = unlimited maxrecords response.write "hello world 1<br>" rs.open strSql, conn response.write "hello world 2<br>" My output when there are fewer than 1000 rows returned is good. With more than 1000 rows I don't get the "hello world 2". I thought that setting pagesize sets up paging and thus allows all rows to be returned regardless of how many there are, while without setting pagesize, paging is not enabled and the limit is 1000 rows. However, my page is acting as if pagesize is not working at all. Please advise.

  • Migrating from MSSQL to Firebird: pros and cons

    - by user193655
    I am considering the migration for several reasons: 1) SQL Server installation is a nightmare, especially for single-user software. The software installs in 10 seconds, SQL Server in 1 hour. Firebird installation is much easier. 2) SQL Server runs on Windows servers only. 3) My customers all have the Express edition. 4) I am not using any advanced features; I am now starting to use FILESTREAM, but the main reason for that is that the Express edition has a 4/10 GB database size limit. So these are all pros of moving to Firebird. What are the cons? I could also plan to support both platforms, but I fear this will backfire.

  • Google Maps API limitations

    - by Henrik Skogmo
    I am working on a project where I am going to create a map of a specific area with some points of interest included (thinking of the ones already included in Google Maps). My first thought was that this could be done with the Google Maps API, but I've never worked with it before, so I have some questions about its limitations and capabilities. Can I limit the map to one specific area? Can I filter the points of interest, e.g. only gas stations and hotels? That's about it to get me started, at least. Thanks!

  • Send data to a web server from C#: what's the most efficient way?

    - by Brian
    I am sending GPS coordinates from a Windows Mobile phone to a web server using a basic program I wrote in C#. The problem is that the data plan on the phone only allows 4 MB per month. I was planning on updating the location every 10 seconds. Currently I just create a web request every 10 seconds to a PHP page on the server, with the coordinates passed in the URL, and the PHP page saves them to the database. This generates about 1 KB of data per request; at this rate I will hit my data limit in less than a day. Is there a more efficient way to do this?

  • How to query a range of data in DB2 with the highest performance?

    - by Fuangwith S.
    Usually I need to retrieve a range of rows from a table; for example, a separate page for each set of search results. In MySQL I use the LIMIT keyword, but in DB2 I don't know what to use. Right now I use this query to retrieve a range of data: SELECT * FROM( SELECT SMALLINT(RANK() OVER(ORDER BY NAME DESC)) AS RUNNING_NO , DATA_KEY_VALUE , SHOW_PRIORITY FROM EMPLOYEE WHERE NAME LIKE 'DEL%' ORDER BY NAME DESC FETCH FIRST 20 ROWS ONLY ) AS TMP ORDER BY TMP.RUNNING_NO ASC FETCH FIRST 10 ROWS ONLY but I know it's bad style. So, how should I query for the best performance?
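
    The usual DB2 idiom, sketched below for rows 11 to 20, uses a single ROW_NUMBER() (RANK() can produce duplicate numbers on ties) and filters the range in the outer query, so there is only one sort and no second FETCH FIRST.

        -- Sketch of the standard pagination pattern: "page 2" of 10 rows.
        SELECT RUNNING_NO, DATA_KEY_VALUE, SHOW_PRIORITY
        FROM (
            SELECT ROW_NUMBER() OVER(ORDER BY NAME DESC) AS RUNNING_NO,
                   DATA_KEY_VALUE,
                   SHOW_PRIORITY
            FROM EMPLOYEE
            WHERE NAME LIKE 'DEL%'
        ) AS TMP
        WHERE RUNNING_NO BETWEEN 11 AND 20
        ORDER BY RUNNING_NO;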

  • Problem decreasing a page's query count

    - by Mac Taylor
    Hey guys, I have a tag table in my PHP/MySQL project that looks like this: table name: bt_tags, table fields: tid, tag. For every story row there is a field named tags (table name: stories, table field: tags); the tag ids are saved in this field as "1 5 6", with spaces between them. Now the problem: when using a while loop to fetch all the rows in the story table, the page uses one query to show every story's details, but to show the tag names I have to query another table, since only the ids are stored in the story table. I currently use a for loop inside the while loop to show the tag names, but I'm sure there is a better way to decrease the page's queries. How can I improve this script and show the tag names without using a for loop? $result = $db->sql_query("SELECT * FROM ".STORY_TABLE." "); while ($row = $db->sql_fetchrow($result)) { //fetching other $vars ---- $tags_id = explode(" ",$row['tags']); $c = count($tags_id); for($i=1;$i<$c-1;$i++){ list($tag_name,$slug) = $db->sql_fetchrow($db->sql_query( 'SELECT `tag`,`slug` FROM `bt_tags` WHERE `tid` = "'.tags_id[$i].'" LIMIT 1' )); $sow_tags = '$tag_name,'; }
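
    One way to drop the per-tag queries while keeping the space-separated tags column is to collect the ids from the stories already fetched and look them all up in one query, or to join in a single pass; a MySQL sketch, where the id values and the selected story columns are only examples:

        -- One lookup for all the tag ids gathered from the fetched stories.
        SELECT tid, tag, slug
        FROM bt_tags
        WHERE tid IN (1, 5, 6);

        -- Or join in one pass: FIND_IN_SET expects a comma-separated list,
        -- so the space-separated stories.tags value is converted first.
        SELECT s.tags, t.tid, t.tag, t.slug
        FROM stories AS s
        JOIN bt_tags AS t
          ON FIND_IN_SET(t.tid, REPLACE(s.tags, ' ', ','));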

  • Adding a Third Table to a Two-Table Join Query

    - by John
    Hello, the query below works just fine. It pulls fields from two MySQL tables, "comment" and "login", for rows where "username" in the table "login" equals the variable "$profile", and where "loginid" in the table "comment" equals the "loginid" pulled from "login". I would like to pull data from a third table called "submission", which has the following fields: submissionid, loginid, title, url, displayurl, datesubmitted. I would like to pull fields from rows in "submission" where "loginid" equals the "loginid" already being pulled from the other two tables, "login" and "comment". How can I do this? Thanks in advance, John. Query: $sqlStrc = "SELECT l.username, l.loginid, c.loginid, c.commentid, c.submissionid, c.comment, c.datecommented FROM comment AS c INNER JOIN login AS l ON c.loginid = l.loginid WHERE l.username = '$profile' ORDER BY c.datecommented DESC LIMIT 10";
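
    A sketch of the extended query, adding one more INNER JOIN on loginid. One caveat: because a member can have many submissions, each comment row is repeated once per submission, so depending on the desired output this may need grouping or a separate query.

        -- The existing two-table join extended with the submission table.
        SELECT l.username, l.loginid,
               c.commentid, c.submissionid, c.comment, c.datecommented,
               s.submissionid AS story_submissionid, s.title, s.url,
               s.displayurl, s.datesubmitted
        FROM comment AS c
        INNER JOIN login AS l ON c.loginid = l.loginid
        INNER JOIN submission AS s ON s.loginid = l.loginid
        WHERE l.username = '$profile'
        ORDER BY c.datecommented DESC
        LIMIT 10;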

  • Qt4: showing elements with pagination

    - by matiit
    I am going to write an application that uses Qt4 (with C++ or Python; it isn't important at the moment). One piece of functionality is "showing all items in the database". Each item has a title, author, description, and photo (constant size), and there could be very many items, let's say 400. There won't be enough space to show them all at once: each row will be 200px high, so I can show at most 4 at a time. How do I paginate them? I have no idea. I can use LIMIT and OFFSET in SQL queries, but how do I tell the window "that's the 5th page"? Any solutions?
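
    The "that's the 5th page" part is just application state: a current-page counter wired to the next/previous buttons. The query only needs LIMIT plus an OFFSET derived from it, as in this sketch (4 items per page; the items table and its columns are hypothetical):

        -- OFFSET = (page - 1) * page_size; page 5 with 4 items per page:
        SELECT title, author, description, photo
        FROM items
        ORDER BY title
        LIMIT 4 OFFSET 16;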

  • Efficient way to combine the results of two database queries

    - by ensnare
    I have two tables on different servers, and I'd like some help finding an efficient way to combine and match the datasets. Here's an example: From server 1, which holds our stories, I perform a query like: query = """SELECT author_id, title, text FROM stories ORDER BY timestamp_created DESC LIMIT 10 """ results = DB.getAll(query) for i in range(len(results)): #Build a string of author_ids, e.g. '1314,4134,2624,2342' But, I'd like to fetch some info about each author_id from server 2: query = """SELECT id, avatar_url FROM members WHERE id IN (%s) """ values = (uid_list) results = DB.getAll(query, values) Now I need some way to combine these two queries so I have a dict that has the story as well as avatar_url and member_id. If this data were on one server, it would be a simple join that would look like: SELECT * FROM members, stories WHERE members.id = stories.author_id But since we store the data on multiple servers, this is not possible. What is the most efficient way to do this? Thanks.

  • How to find/display the Upload File limits on IIS with ASP.NET?

    - by NVRAM
    I have a web service on which end users will upload ZIP archives that can be very large (one test file is over 200 MB). I'd like to handle oversized files proactively and size-limited upload failures gracefully. Since the web app will be deployed on customers' machines, I cannot easily ensure that the configuration matches any fixed size. I've documented how to use the appcmd command so they can set the requestLimits.maxAllowedContentLength value beyond the 30MB default, but I'd like to handle it in the web app. I'm hoping for two things: to show the current limit on the page where they initiate the file upload, something along the lines of "Each file upload is limited to 15MB. If your archive is larger, (etc., etc., etc.)"; and to give a meaningful error when that size is exceeded. Currently it takes a long time for the data to be sent, and then I see a misleading 404 page. Any thoughts?

  • Walking through an SQLite Table

    - by galford13x
    I would like to implement or use functionality that allows stepping through a table in SQLite. If I have a table Products with 100k rows, I would like to retrieve perhaps 10k rows at a time, something similar to how a web page would list data and have a < Previous .. Next > link to walk through it. Are there SELECT statements that can make this simple? I have seen and tried using ROWID in conjunction with LIMIT, which seems OK if not ordering the data. // This seems to work if not ordering. SELECT * FROM Products WHERE ROWID BETWEEN x AND y;
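
    Two sketches for stepping through the table 10k rows at a time with an ORDER BY: LIMIT/OFFSET is the simplest, while keyset paging (remembering the last value shown) stays fast on deep pages; the Name column is hypothetical.

        -- 1) LIMIT/OFFSET: simple, but the offset scan gets slower on later pages.
        SELECT * FROM Products ORDER BY Name LIMIT 10000 OFFSET 20000;

        -- 2) Keyset ("seek") paging: pass the last value from the previous page
        --    and continue from there; an index on the sort column keeps it fast.
        SELECT * FROM Products
        WHERE Name > :last_name
        ORDER BY Name
        LIMIT 10000;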

  • Which type of Rails model association should I use in this situation?

    - by jstayton
    I have two models/tables in my Rails application: discussions and comments. Each discussion has_many comments, and each comment belongs_to a discussion. My discussions table also includes a first_comment_id column and last_comment_id column for convenience and speed. I want to be able to call discussion.last_comment for the last comment model, but the following (in my discussion model) isn't working to make this happen: has_one :first_comment, :class_name => "Comment" has_one :last_comment, :class_name => "Comment" When I call discussion.last_comment, the following SQL is run: SELECT * FROM `comments` WHERE (`comments`.discussion_id = 1) LIMIT 1 It's using the discussions.id column to join against comments.discussion_id, when I want it to join discussions.last_comment_id against comments.id. Am I using the wrong type of association here? Thanks for your help!
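
    For comparison, the lookup the association should produce looks like the SQL sketch below: the foreign key last_comment_id lives on discussions, so the join runs the other way (on the Rails side that direction is normally declared as a belongs_to with an explicit foreign key rather than a has_one).

        -- Intended lookup: follow discussions.last_comment_id to comments.id.
        SELECT comments.*
        FROM comments
        INNER JOIN discussions ON comments.id = discussions.last_comment_id
        WHERE discussions.id = 1
        LIMIT 1;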

  • Security Exception while implementing global search for Messaging

    - by Sunil
    I am trying to enable global search for the Messaging application (i.e., messages can be searched from the home screen search box). I have followed all the steps given in http://developer.android.com/reference/android/app/SearchManager.html but I am getting the following exception: 04-16 12:49:26.917: ERROR/DatabaseUtils(102): java.lang.SecurityException: Permission Denial: reading com.android.providers.telephony.MmsSmsProvider uri content://mms-sms/search_suggest_query/m?limit=58 from pid=106, uid=10000 requires android.permission.READ_SMS I have set permissions in the MmsSmsProvider.java file for reading/writing SMS and for global search, but I still get this error. Can anyone help? Regards, Sunil.

  • Input multiple file names in windows open file dialog box

    - by goodiet
    Windows 7 allows you to select multiple files to open at once by using the Ctrl or Shift key. The "File Name" input field at the bottom of the dialog box then auto-populates with something like the following: "aaa.txt" "bbb.txt" "ccc.txt" "ddd.txt" I have 14,000 files in a folder and I only need a range of them (approx. 500). When I use the Shift key to select a range of files, the "File Name" field auto-populates with all 500 file names. However, Windows cuts me off at the 260th character when I try to paste a pre-generated string into the "File Name" field. Is there a way to bypass the 260-character limit so it will accept my entire string of 500 file names?

  • AJAX reload of the same page on click

    - by user277891
    Hi, this is my situation: I have some 10 links on a page. When the user clicks one of those links, an AJAX reload of the same page must take place. To be clear, I have something like this: <a href="test.php?name=one">one</a> | <a href="test.php?name=Two">Two</a> If JavaScript is enabled, the AJAX load should take place on click. If JavaScript is disabled, then the links above should work as normal. Basically I am using "name" to limit some values on my search page.

  • HTTP tunneling a TCPClient application

    - by user360116
    We have a custom chat application (C#) which uses TCPClient. We are having problems with clients who are behind a firewall or proxy. We know that these clients can browse the internet without a problem, so we decided to change our TCPClient application so that it uses HTTP messages to communicate. Will it be enough just to wrap our text messages with standard HTML tags and HTTP headers? We need a long-lasting connection. Does keep-alive have a limit? Do firewalls or proxies have time limits for "alive" connections?

  • Breaking change October: status 'waiting' when creating a custom audience with an added user data file?

    - by THACH LN
    In my system, I create a custom audience (data file, mobile advertiser ids) with 10k users. After creating it, I refreshed the API link to get information about the audience I had just created: https://graph.facebook.com/act_xxx/customaudiences?fields=id,account_id,name,lookalike_spec,retention_days,subtype,approximate_count,rule,delivery_status,operation_status,data_source,permission_for_actions&limit=500&access_token=xxx. (After the breaking change in October 2014, the status field is no longer returned, so I look at operation_status.) I see this: "operation_status": { "code": 410, "description": "No file has been uploaded for this audience, or the previous upload has failed due to system error. Please try uploading the file again." } But after about 10s, I refresh again, and operation_status changes to: "operation_status": { "code": 200, "description": "Normal" } My question is: how do I check that this audience's status is waiting (uploading file), like when I create it on Facebook rather than through our own system? Thanks for all answers.

  • SSIS - Limiting Concurrent Connections

    - by Bigtoe
    Hi folks, I am using SSIS to connect to a legacy mainframe database that allows only 5 concurrent connections at a time. I have a data flow task with many tables to transfer, and it errors out because of this limitation. I have split the data flow task into separate data flows and this is working for the moment, but it is not optimal, as they need to be sequenced and one large transfer in a flow holds up subsequent transfers. Does anyone have any idea how to limit the number of connections in a single data flow? I had a look at using the Engine Threads setting, but this did not make any difference. Any help much appreciated.
