Search Results

Search found 37260 results on 1491 pages for 'command query responsibil'.


  • Graph API - Get events by owner/creator

    - by jwynveen
    Is there a way with the Facebook Graph API to get a list of all events created by a single profile? Our client creates a bunch of events and we want to pull a list of them all. I said that they would just have to make sure they set themselves as attending each event, because then I can easily pull the list of events that profileId is attending, but I'm curious if there's another way. Maybe an FQL query? FQL seems to require querying on the primary key, though. And what would that FQL query look like, if that's the way to do it?

    Read the article

  • NSStream sockets missing data

    - by Chris T.
    I am trying to pull some sample data from FreeDB as a proof of concept, but I am having a tough time retrieving all of the data off the incoming stream (I am only getting the last bits for the final query listed here, when handshakeCode == 3). I think this may be something with the threading on the main runloop, but I am not sure. The odd thing is that when the buffer size is larger than 1-2 bytes (a 1-2 byte buffer works as expected), I seem to lose access to the data programmatically (the totalOutput variable on the first set of data is incomplete). I set up a packet capture, and it looks like those 1024 bytes are coming across the wire, but the app just isn't working with them. It looks like the next event is coming through and basically taking over. I tried using an NSLock, to no avail as well. If I drop the buffer size down to 1 or 2, things seem to read just fine. This is probably obvious to someone who does this all the time, but this is my first foray into this area, although I am familiar with the technology in other languages/platforms. The following code will show you what is happening. Run it with the buffer set to 1024 and you will see a short final string, but once you set it to 1, you will see the amount of data I was expecting (I was even expecting it to be split, so that's not a big worry):

        #import <Foundation/Foundation.h>
        #import <Cocoa/Cocoa.h>

        //STACK OVERFLOW CODE:
        @interface stackoverflow : NSObject <NSStreamDelegate> {
            NSInputStream *iStream;
            NSOutputStream *oStream;
            int handshakeCode;
            NSString *selectedDiscId;
            NSString *selectedGenre;
        }
        -(void)getMatchesFromFreeDB;
        -(void)sendToOutputStream:(NSString*)command;
        @end

        @implementation stackoverflow

        -(void)getMatchesFromFreeDB {
            NSHost *host = [NSHost hostWithName:@"freedb.freedb.org"];
            [NSStream getStreamsToHost:host port:8880 inputStream:&iStream outputStream:&oStream];
            [iStream retain];
            [oStream retain];
            [iStream setDelegate:self];
            [oStream setDelegate:self];
            [iStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
            [oStream scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
            [iStream open];
            [oStream open];
            handshakeCode = 0; //not done any processing
        }

        -(void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode {
            switch(eventCode) {
                case NSStreamEventOpenCompleted: {
                    NSLog(@"Stream open completed");
                    break;
                }
                case NSStreamEventHasBytesAvailable: {
                    NSLog(@"Stream has bytes available");
                    if (aStream == iStream) {
                        NSMutableString *totalOutput = [NSMutableString stringWithString:@""];
                        //read data
                        uint8_t buffer[1024];
                        int len;
                        while ([iStream hasBytesAvailable]) {
                            len = [iStream read:buffer maxLength:sizeof(buffer)];
                            if (len > 0) {
                                NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSUTF8StringEncoding];
                                //this could have also been put into an NSData object
                                if (nil != output) {
                                    //append to the total output
                                    [totalOutput appendString:output];
                                }
                            }
                        }
                        NSLog(@"OUTPUT , %i:\n\n%@", [totalOutput lengthOfBytesUsingEncoding:NSUTF8StringEncoding], totalOutput);
                        NSArray *outputComponents = [totalOutput componentsSeparatedByString:@" "];
                        //Attempt to get handshake code, since we haven't done it yet:
                        if (handshakeCode == 1) {
                            //we are just getting the sign-on banner:
                            //let's move on:
                            handshakeCode = 2;
                        } else if (handshakeCode == 2) {
                            handshakeCode = [[outputComponents objectAtIndex:0] intValue];
                            if (handshakeCode == 200) {
                                NSLog(@"---Handshake OK %i", handshakeCode);
                                NSMutableString *query = [NSMutableString stringWithString:@"cddb query f3114b11 17 225 19915 36489 54850 69425 87025 103948 123242 136075 152817 178335 192850 211677 235104 262090 284882 308658 4430\n"];
                                handshakeCode = 3;
                                [self sendToOutputStream:query];
                            }
                        } else if (handshakeCode == 3) {
                            //now, we are reading out the matches:
                            if ([[outputComponents objectAtIndex:0] intValue] == 200) { //found exact match:
                                NSLog(@"Found exact match");
                                selectedGenre = [outputComponents objectAtIndex:1];
                                selectedDiscId = [outputComponents objectAtIndex:2];
                                if (selectedGenre && selectedDiscId) {
                                    //send off the request to get the entry:
                                    NSString *query = [NSString stringWithFormat:@"cddb read %@ %@\n", selectedGenre, selectedDiscId];
                                    [self sendToOutputStream:query];
                                    handshakeCode = 4;
                                }
                            }
                        }
                    }
                    break;
                }
                case NSStreamEventEndEncountered: {
                    NSLog(@"Stream event end encountered");
                    break;
                }
                case NSStreamEventErrorOccurred: {
                    NSLog(@"Stream error occurred");
                    break;
                }
                case NSStreamEventHasSpaceAvailable: {
                    NSLog(@"Stream has space available");
                    if (aStream == oStream) {
                        if (handshakeCode == 0) {
                            handshakeCode = 1;
                            [self sendToOutputStream:@"cddb hello stackoverflow localhost.localdomain test .01BETA\n"];
                        }
                    }
                    break;
                }
            }
        }

        -(void)sendToOutputStream:(NSString*)command {
            const uint8_t *rawCommand = (const uint8_t *)[command UTF8String];
            [oStream write:rawCommand maxLength:strlen(rawCommand)];
            NSLog(@"Sent command: %@", command);
        }
        @end

        int main (int argc, const char * argv[]) {
            NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
            stackoverflow *test = [[stackoverflow alloc] init];
            [test getMatchesFromFreeDB];
            NSRunLoop *runLoop = [NSRunLoop currentRunLoop];
            [runLoop run];
            [pool drain];
            return 0;
        }

    Any help is much appreciated! Thanks

    Read the article

  • Pass elements of a list as arguments to a function in python

    - by Wilduck
    I'm building a simple interpreter in python and I'm having trouble handling differing numbers of arguments to my functions. My current method is to get a list of the commands/arguments as follows: args = str(raw_input('>> ')).split() com = args.pop(0) Then to execute com, I check to see if it is in my dictionary of command-code mappings and if it is I call the function I have stored there. For a command with no arguments, this would look like: commands[com]() However, if a command had multiple arguments, I would want this: commands[com](args[0],args[1]) Is there some trick where I could pass some (or all) of the elements of my arg list to the function that I'm trying to call? Or is there a better way of implementing this without having to use python's cmd class?
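
    For reference, Python's argument unpacking handles exactly this case: commands[com](*args) passes each element of args as a separate positional argument. A minimal sketch of the dispatch loop described above (the command names and handler functions are made up for illustration):

        # hypothetical handlers, just for illustration
        def greet():
            print("hello")

        def add(a, b):
            print(int(a) + int(b))

        commands = {"greet": greet, "add": add}

        args = str(raw_input('>> ')).split()
        com = args.pop(0)
        if com in commands:
            commands[com](*args)   # unpacks the remaining list items as positional arguments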

    Read the article

  • Disabling scoring in Lucene(.NET)

    - by user72185
    Hi, When searching, is there a way to disable scoring for any query? The scenario is that the user refines his query by trying different combinations of words, phrases etc., and needs realtime (well, reasonably fast at least) responses on the number of hits. Search time slows down a lot when there are millions of hits due to scoring, but the user really doesn't care about all these documents. As soon as he sees there are 1M+ hits he will start adding additional words to the query. A "Sort by relevance" option would allow him to do this quickly, while turning scoring back on when the number of hits is reasonable. Is this possible? I'm using Lucene.NET 2.9.2 but AFAIK it is identical to the Java version.

    Read the article

  • How to avoid SQLiteException locking errors

    - by TheArchedOne
    I'm developing an Android app. It has multiple threads reading from and writing to the Android SQLite db. I am receiving the following error: SQLiteException: error code 5: database is locked I understand that SQLite locks the entire db on inserting/updating, but these errors only seem to happen when inserting/updating while I'm running a select query. The select query returns a cursor which is left open quite a while (a few seconds sometimes) while I iterate over it. If the select query is not running, I never get the locks. I'm surprised that the select could be locking the db... is this possible, or is something else going on? What's the best way to avoid such locks? Thanks TAO

    Read the article

  • Problem with order by in LINQ

    - by vikitor
    Hi, I'm passing from the controller an array generated by the following code: public ActionResult GetClasses(bool ajax, string kingdom) { int _kingdom = _taxon.getKingdom(kingdom); var query = (from c in vwAnimalsTaxon.All() orderby c.ClaName select new { taxRecID = c.ClaRecID, taxName = c.ClaName }).Distinct(); return Json(query, JsonRequestBehavior.AllowGet); } The query results should be ordered, but they aren't: the class names come back in the wrong order in the array (I've seen while debugging that the names are not ordered). The view is just a dropdown box loaded automatically, so I'm almost sure the problem is with the action. Do you see anything wrong? Am I missing something?

    Read the article

  • Detect if a table contains a column in Android/sqlite

    - by sandis
    So I have an app on the market, and with an update I want to add some columns to the database. No problems so far. But I want to detect if the database in use is missing these columns, and add them if this is the case. I need this to be done dynamically and not just after the update to the new version, because the application is supposed to still be able to import older databases. Normally I would be able to use a PRAGMA query, but I'm not sure how to do this with Android. I can't use execSQL since it is a query, and I can't figure out how to use PRAGMA with the query() function. Of course I could just catch exceptions and then add the column, or always add the columns to each table before I start to work with it, but that is not a neat solution. Cheers,

    Read the article

  • How to implement a left outer join in the Entity Framework.

    - by user206736
    I have the following SQL query: select distinct * from dbo.Profiles profiles left join ProfileSettings pSet on pSet.ProfileKey = profiles.ProfileKey left join PlatformIdentities pId on pId.ProfileKey = profiles.Profilekey I need to convert it to a LINQ to Entities expression. I have tried the following: from profiles in _dbContext.ProfileSet let leftOuter = (from pSet in _dbContext.ProfileSettingSet select new { pSet.isInternal }).FirstOrDefault() select new { profiles.ProfileKey, Internal = leftOuter.isInternal, profiles.FirstName, profiles.LastName, profiles.EmailAddress, profiles.DateCreated, profiles.LastLoggedIn, }; The above query works fine, but it doesn't consider the third table "PlatformIdentities"; a single left outer join works with what I have done above. How do I include PlatformIdentities (the third table)? I basically want to translate the SQL query I specified at the beginning of this post (which gives me exactly what I need) into LINQ to Entities. Thanks

    Read the article

  • Installing Sass on Gentoo

    - by iyrag
    I've been trying to install Sass on Gentoo, but it hasn't been going too well. Unfortunately, the latest version of Sass in portage is 3.1.21. What I want to use Sass for requires at least Sass 3.2, which is available through rubygems. What I've tried: emerge dev-ruby/sass (installs an old version) gem install sass The second command appears to install the Sass gem. However, I do not use Rails or Ruby in any other aspect apart from Sass, so the gem appears useless to me. In addition, I do not know where gems are installed to or how to use them (I'm a ruby noob.) All I want to do is call sass from the command line. Are there any ways to obtain an up-to-date version of Sass which I can just use from the command line? Cheers.

    Read the article

  • bash script to check running process

    - by elasticsecurity
    I wrote a bash script to check if a process is running. It doesn't work since the ps command always returns exit code 1. When I run the ps command from the command line, $? is correctly set, but within the script it is always 1. Any idea?

        #!/bin/bash
        SERVICE=$1
        ps -a | grep -v grep | grep $1 > /dev/null
        result=$?
        echo "exit code: ${result}"
        if [ "${result}" -eq "0" ] ; then
            echo "`date`: $SERVICE service running, everything is fine"
        else
            echo "`date`: $SERVICE is not running"
        fi

    Bash version: GNU bash, version 3.2.25(1)-release (x86_64-redhat-linux-gnu)

    Read the article

  • Fetch data from multiple MySQL tables

    - by Jon McIntosh
    My two tables look like this:

        TABLE1                       TABLE2
        +--------------------+       +--------------------+
        |field1|field2|field3|  and  |field2|field4|field5|
        +--------------------+       +--------------------+

    I am already running a SELECT query for TABLE1, and assigning all of the data to variables:

        $query = "SELECT * FROM TABLE1 WHERE field2 = 2";
        $result = mysql_query($query);
        $num_rows = mysql_num_rows($result);
        if((!is_bool($result) || $result) && $num_rows) {
            while($row = mysql_fetch_array($result)) {
                $field1 = $row['field1'];
                $field2 = $row['field2'];
                $field3 = $row['field3'];
            }
        }

    What I want to do is get the data from 'field4' on TABLE2 and add it to my variables. I would want to get field4 WHERE field2 = 2

    Read the article

  • Paging enormous tables on DB2

    - by grenade
    We have a view that, without constraints, will return 90 million rows and a reporting application that needs to display paged datasets of that view. We're using nhibernate and recently noticed that its paging mechanism looks like this: select * from (select rownumber() over() as rownum, this_.COL1 as COL1_20_0_, this_.COL2 as COL2_20_0_ FROM SomeSchema.SomeView this_ WHERE this_.COL1 = 'SomeValue') as tempresult where rownum between 10 and 20 The query brings the db server to its knees. I think what's happening is that the nested query is assigning a row number to every row satisfied by the where clause before selecting the subset (rows 10 - 20). Since the nested query will return a lot of rows, the mechanism is not very efficient. I've seen lots of tips and tricks for doing this efficiently on other SQL platforms but I'm struggling to find a DB2 solution. In fact an article on IBM's own site recommends the approach that nhibernate has taken. Is there a better way?

    Read the article

  • MySQLNonTransientConnectionException in jdbc program at run time

    - by Sunil Kumar Sahoo
    Hi, I have created a JDBC MySQL connection. My program works fine for simple query execution, but if I run the same program for more than 10 hours and then execute a query, I receive the following MySQL exception. I have not used the close() method anywhere; I created the database connection, opened it forever, and always execute queries on it. There is nowhere that I explicitly set a timeout for the connection. I am unable to identify the problem. com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Connection.close() has already been called. Invalid operation in this state. at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: No operations allowed after statement closed. at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) Sample code for the database connection: String driver = PropertyReader.getDriver(); String url = dbURLPath; Class.forName(driver); connectToServerDB = DriverManager.getConnection(url); connectToServerDB.setAutoCommit(false);

    Read the article

  • How do I automatically rebuild the Sphinx index under django-sphinx?

    - by Apreche
    I just set up django-sphinx, and it is working beautifully. I am now able to search my model and get amazing results. The one problem is that I have to build the index by hand using the indexer command. That means every time I add new content, I have to manually hit the command line to rebuild the search index. That is just not acceptable. I could make a cron job that automatically runs the indexer command every so often, but that's far from optimal. New data won't be indexed until the cron job runs again. In addition, the indexer will run unnecessarily most of the time, as my site doesn't have data added very often. How do I set it up so that the Sphinx index will automatically rebuild itself whenever data is added to or modified in a searchable Django model?
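
    One possible approach (a sketch, not specific to django-sphinx): hook Django's post_save/post_delete signals for the searchable model and have the handler shell out to Sphinx's indexer with --rotate, which rebuilds the index and swaps it in while searchd keeps serving. The model and index names below are placeholders:

        import subprocess
        from django.db.models.signals import post_save, post_delete

        from myapp.models import Article   # hypothetical searchable model

        def rebuild_sphinx_index(sender, **kwargs):
            # "articles" stands in for the index name defined in sphinx.conf
            subprocess.Popen(["indexer", "--rotate", "articles"])

        post_save.connect(rebuild_sphinx_index, sender=Article)
        post_delete.connect(rebuild_sphinx_index, sender=Article)

    In practice you would want to debounce this (queue the rebuild, or only fire it every N saves), since reindexing on every single save can be as heavy as the cron approach.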

    Read the article

  • java OutOfMemoryError

    - by dqm
    I am running the following command on a Unix box: java -Xms3800m -Xmx3800m org.apache.xalan.xslt.Process -out Cust.txt -in test13l.xml -xsl CustDetails.xsl It is a Java command which calls the Xalan processor to parse the XML file (test13l.xml) using the XSL stylesheet (CustDetails.xsl) and produces Cust.txt. The command works fine and the output is generated. It takes 12 minutes to process an XML file of 1.1 GB, and 22 minutes to process a file of 1.44 GB. However, when I try to process a file of 1.66 GB, it errors out with the following message: (Location of error unknown)XSLT Error (java.lang.OutOfMemoryError): null I have increased the Java heap size to 3800m and am not sure what more I can do. Many thanks for your help.

    Read the article

  • Table alias has an effect with execution time?

    - by RedFux227
    So I have this query: SELECT A.e_cd "INFLATE" , A.ccgpt "1" , ( SELECT J.comp FROM JBB J WHERE J.e_cd=A.e_cd AND emplo_RCD=:1 AND J.Dte = ( SELECT MAX(J1.Dte) FROM JBB J1 WHERE J.e_cd=J1.e_cd AND TO_CHAR(J1.Dte,'MM') <=A.mth_to AND TO_CHAR(J1.Dte,'YYYY') <=A.year ) AND J.seq = ( SELECT MAX(J2.seq) FROM PS_JOB J2 WHERE J2.e_cd=J.e_cd AND J2.Dte=J.Dte)) "Company" FROM PPS A WHERE orcd=:2 AND rcd=:3 AND tx_cd=:4 AND year=:5 And compare it with this one: SELECT A.e_cd "INFLATE" , A.ccgpt "1" , ( SELECT J.comp FROM JBB J WHERE J.e_cd=A.e_cd AND emplo_RCD=:1 AND J.Dte = ( SELECT MAX(J1.Dte) FROM JBB J1 WHERE J.e_cd=J1.e_cd AND TO_CHAR(J1.Dte,'MM') <=A.mth_to AND TO_CHAR(J1.Dte,'YYYY') <=A.year ) AND J.seq = ( SELECT MAX(J1.seq) **FROM PS_JOB J1 WHERE J1.e_cd=J.e_cd AND J1.Dte=J.Dte**)) "Company" FROM PPS A WHERE orcd=:2 AND rcd=:3 AND tx_cd=:4 AND year=:5 The only difference is in the innermost subquery (marked between asterisks): the first query uses a new alias J2, while the second reuses J1. The first query runs for about 3.20 sec with 8,134 buffer gets, and the second runs for about 1.73 sec with 7,006 buffer gets. So, does the table alias somehow have an impact on the execution time / buffer gets? Thanks in advance!

    Read the article

  • PHP Error - Login Script

    - by gamerzfuse
    I am creating a new login script/members directory. I am creating it from scratch without any frameworks (advice on this matter would also be appreciated). The situation: // Look up the username and password in the database $query = "SELECT admin_id, username FROM admin WHERE adminname = '$admin_user' AND password = SHA1('$admin_pass')"; $data = mysqli_query($dbc, $query); if (mysqli_num_rows($data) == 1) { This bit of code keeps giving me an error (the last line in particular): Warning: mysqli_num_rows() expects parameter 1 to be mysqli_result, boolean given in /home8/craighoo/public_html/employees/security/dir_admin.php on line 20 When echoing the query I get: SELECT admin_id, username FROM admin WHERE adminname = 'admin' AND password = SHA1('tera#byte') Thanks in advance!

    Read the article

  • DOS Batch script loop

    - by Tom J Nowell
    I need to execute a command 100-200 times, and so far my research indicates that I would either have to copy-paste 100 copies of this command, OR use a FOR loop, but the FOR loop expects a list of items, hence I would need 200 files to operate on, or a list of 200 items, defeating the point. I would really rather not have to write a C program and go to the length of documenting why I had to write another program to execute my program for test purposes. Modifying my program itself is also not an option. So, given a command, how would I execute it n times via a DOS batch script?

    Read the article

  • INNER JOIN vs LEFT JOIN performance in SQL Server

    - by Ekkapop
    I've created a SQL command that uses INNER JOIN on 9 tables; anyway, this command takes a very long time (more than five minutes). So my colleague suggested I change INNER JOIN to LEFT JOIN because the performance of LEFT JOIN is better, which at first was contrary to what I knew. After I changed it, the speed of the query improved significantly. I want to know why LEFT JOIN is faster than INNER JOIN. My SQL command looks like the below: SELECT * FROM A INNER JOIN B ON ... INNER JOIN C ON ... INNER JOIN D and so on

    Read the article

  • Searching for Records

    - by 47
    I've come up with a simple search view to search for records in my app. The user just enters all parameters in the search box, all of this is matched against the database, and the results are returned. One of these fields is the phone number... in the database it's stored in the format XXX-XXX-XXX. A search, for example, for "765-4321" pulls up only "416-765-4321"... however, I want it to return both "416-765-4321" and "4167654321". My view is as below:

        def search(request, page_by=None):
            query = request.GET.get('q', '')
            if query:
                term_list = query.split(' ')
                q = Q(first_name__icontains=term_list[0]) | Q(last_name__icontains=term_list[0]) | Q(email_address__icontains=term_list[0]) | Q(phone_number__icontains=term_list[0])
                for term in term_list[1:]:
                    q.add((Q(first_name__icontains=term) | Q(last_name__icontains=term) | Q(email_address__icontains=term) | Q(phone_number__icontains=term)), q.connector)
                results = Customer.objects.filter(q).distinct()
                all = results.count()
            else:
                results = []
            if 'page_by' in request.GET:
                page_by = int(request.REQUEST['page_by'])
            else:
                page_by = 50
            return render_to_response('customers/customers-all.html', locals(), context_instance=RequestContext(request))
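
    One way to make the phone field format-insensitive is to also search with the punctuation stripped from the term, so either stored format can match. A minimal sketch, assuming phone_number only ever contains digits and hyphens (phone_q is a hypothetical helper you would fold into the view above):

        import re
        from django.db.models import Q

        def phone_q(term):
            # match the term as typed, and also with all non-digits removed,
            # so "765-4321" can hit both "416-765-4321" and "4167654321"
            digits = re.sub(r'\D', '', term)
            q = Q(phone_number__icontains=term)
            if digits:
                q |= Q(phone_number__icontains=digits)
            return q

    This still misses the case where the user types bare digits but the row is stored with hyphens; the usual fix for that is to keep a digits-only copy of the number in a separate column and search that instead.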

    Read the article

  • LINQ To SQL Wildcards

    - by mcass20
    How can I build in wildcards to my LINQ To SQL lambda expression? This is what I have currently: var query = from log in context.Logs select log; foreach (string filter in CustomReport.ExtColsToFilter) { string tempFilter = filter; query = query.Where(Log => Log.FormattedMessage.Contains(tempFilter)); } This works fine up until I try and pass wildcards in the filter string. I'm experimenting with SqlMethods.Like() but to no avail. The filters above look like this: "<key>NID</key><value>mcass</value>". I'd like to be able to pass filters like this: "<key>NID</key><value>%m%</value>"

    Read the article

  • How do I perform a batch insert in Django?

    - by Thierry Lam
    In MySQL, you can insert multiple rows into a table in one query for n > 0: INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9), ..., (n-2, n-1, n); Is there a way to achieve the above with Django queryset methods? Here's an example: values = [(1, 2, 3), (4, 5, 6), ...] for value in values: SomeModel.objects.create(first=value[0], second=value[1], third=value[2]) I believe the above calls an insert query for each iteration of the for loop. I'm looking for a single query; is that possible in Django?
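
    For reference, newer Django versions (1.4 and later) expose exactly this as QuerySet.bulk_create, which issues a single multi-row INSERT. A minimal sketch reusing the SomeModel and field names from the question:

        values = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]
        SomeModel.objects.bulk_create(
            [SomeModel(first=a, second=b, third=c) for a, b, c in values]
        )

    Note that bulk_create bypasses the model's save() method and signals, so it is only a drop-in replacement when nothing depends on those.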

    Read the article

  • Comparing 2 columns in the same table with the "Like" function

    - by Vic
    I'm trying to come up with a way to query the values in two different columns in the same table, where the result set will indicate instances where the value of columnB doesn't contain the value of columnA. For example, my "Nodes" table contains columns "NodeName" and "DNS". The values should look similar to the following:

        NodeName    DNS
        Router1     Router1.mydomain.com

    I want to run a query to show which rows have a DNS value that does not contain (or begin with) the value of the NodeName field. I think the query should work something like the following, but obviously I'm missing something with regard to the use of LIKE in this situation. SELECT NodeName, DNS WHERE DNS NOT LIKE 'NodeName%' I'm using SQL Server 2005, and any suggestions would be greatly appreciated... :)
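
    The key is to build the pattern from the column itself rather than quoting the column name as a string literal; in SQL Server that would be WHERE DNS NOT LIKE NodeName + '%'. A self-contained sketch of the same idea using Python's built-in sqlite3 (where string concatenation is || instead of +), with made-up sample rows:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE Nodes (NodeName TEXT, DNS TEXT)")
        conn.executemany("INSERT INTO Nodes VALUES (?, ?)", [
            ("Router1", "Router1.mydomain.com"),   # DNS starts with NodeName -> excluded
            ("Switch7", "sw7.mydomain.com"),       # DNS does not start with NodeName -> returned
        ])

        # concatenate the column with the wildcard instead of quoting the column name
        rows = conn.execute(
            "SELECT NodeName, DNS FROM Nodes WHERE DNS NOT LIKE NodeName || '%'"
        ).fetchall()
        print(rows)   # [('Switch7', 'sw7.mydomain.com')]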

    Read the article

  • PATH and CLASSPATH in Windows 7 / Eclipse

    - by Richard Knop
    So I would like to set the PATH and CLASSPATH system variables so I can use the javac and java commands on the command line. I can compile and run Java programs in Eclipse, but I would also like to be able to run them from the command line. This is where I have Java installed:

        C:\Program Files (x86)\Java
            jdk1.6.0_20
            jre6

    And this is where Eclipse stores my Java projects:

        D:\java-projects
            HelloWorld
                bin
                    HelloWorld.class
                src
                    HelloWorld.java

    I have set up the PATH and CLASSPATH variables like this: PATH: C:\Program Files (x86)\Java\jdk1.6.0_20\bin CLASSPATH: D:\java-projects But it doesn't work. When I write java HelloWorld or java HelloWorld.class I get an error like this: Exception in thread "main" java.lang.NoClassDefFoundError: HelloWorld The error is longer; that's just the first line. How can I fix this? I'm mainly interested in being able to run compiled .class programs from the command line; I can do the compiling in Eclipse.

    Read the article

  • Django equivalent for latest entry for each user

    - by paul-ogrady
    Hi, I'm surprised this question hasn't come up; I couldn't find much on the web. Using Entry.objects.latest('created_at') I can recover the latest entry for all Entry objects, but what if I want the latest entry for each user? This is similar to an SQL latest-record-per-group query. But how do I achieve this using the ORM? Here is my approach; I'm wondering if it is the most efficient way to do what I want. First I perform a subquery: objects are grouped by user and the Max (latest) created_at value is returned for each user (created_at__max). I then filter Entry objects based on the results of the subquery and get the required objects:

        Entry.objects.filter(created_at__in=Entry.objects.values('user').annotate(Max('created_at')).values_list('created_at__max'))

    or using a manager:

        class UsersLatest(models.Manager):
            def get_query_set(self):
                return super(UsersLatest, self).get_query_set().filter(
                    created_at__in=self.model.objects.values('user')
                                       .annotate(Max('created_at'))
                                       .values_list('created_at__max')
                )

    Is there a more efficient way, possibly without the subquery? Thanks, Paul
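
    For comparison, a two-step sketch of the same idea that can be easier to read than the nested queryset (same model and field names as the question):

        from django.db.models import Max

        # step 1: the latest timestamp for each user
        latest_per_user = (Entry.objects.values('user')
                                        .annotate(latest=Max('created_at'))
                                        .values_list('latest', flat=True))

        # step 2: the entries carrying one of those per-user maxima
        latest_entries = Entry.objects.filter(created_at__in=latest_per_user)

    Like the original query, this matches on the timestamp alone, so if two different users happen to share an identical created_at it can pull in an extra row.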

    Read the article
