Search Results

Search found 60085 results on 2404 pages for 'data flow diagrams'.

Page 102/2404 | < Previous Page | 98 99 100 101 102 103 104 105 106 107 108 109  | Next Page >

  • Seeking a faster $(':data(key)')

    - by PoltoS
    I'm writing an extension to jQuery that adds data to DOM elements using el.data('lalala', my_data); and then uses that data to update elements dynamically. Each time I get new data from the server I need to update all elements having el.data('lalala') != null. To get all the elements I need, I use an extension by James Padolsey: $(':data(lalala)').each(...);

    Everything was great until I came to a situation where I need to run that code 50 times - it is very slow! It takes about 8 seconds to execute on my page with 3640 DOM elements:

        var x, t = (new Date).getTime();
        for (n = 0; n < 50; n++) {
            jQuery(':data(lalala)').each(function() { x++; });
        };
        console.log(((new Date).getTime() - t) / 1000);

    Since I don't need a RegExp as the parameter of the :data selector, I tried to replace it with:

        var x, t = (new Date).getTime();
        for (n = 0; n < 50; n++) {
            jQuery('*').each(function() { if ($(this).data('lalala')) x++; });
        };
        console.log(((new Date).getTime() - t) / 1000);

    This code is faster (5 seconds), but I want more.

    Q: Is there a faster way to get all elements with this data key?

    In fact, I can keep an array of all the elements I need, since I execute .data('key') in my module. Checking 100 elements that have the desired .data('lalala') is better than checking 3640 :) So the solution would look like:

        for (i in elements) {
            el = elements[i];
            ....

    But sometimes elements are removed from the page (using jQuery .remove()). Both solutions described above [the $(':data(lalala)') selector and the if ($(this).data('lalala')) check] skip removed items (as I need), while the array solution will still point to the removed element (in fact, the element would not really be deleted - it is only detached from the DOM tree - because my array still holds a reference). I found that .remove() also removes the data from the node, so my solution becomes:

        var toRemove = [];
        for (var i in elements) {
            var el = elements[i];
            if ($(el).data('lalala')) {
                ....
            } else {
                toRemove.push(i);
            }
        }
        // splice in reverse so earlier removals don't shift the stored indexes
        for (var ii = toRemove.length - 1; ii >= 0; ii--) {
            elements.splice(toRemove[ii], 1);
        }

    This solution is 100 times faster!

    Q: Will the garbage collector release the memory taken by the DOM elements once they are deleted from that array? Remember, the elements were referenced by the DOM tree, we made a new reference in our array, then removed them with .remove(), and then removed them from the array.

    Is there a better way to do this?

    Read the article

  • Ecryptfs: lost passphrase

    - by Sherlock3890
    When I mounted a directory with mount -t ecryptfs private data, I entered the wrong password. I then wrote data into this directory and now I can't mount it. I don't have a valid password or passphrase (I only know roughly what it was), but I do have the SIG in /root/.ecryptfs/sig-cache.txt. How can I recover my directory or, at least, brute-force it: try many, many passwords the same way they are entered when mounting this directory and compare the generated sig with the existing one?

    Read the article

  • Daily tech links for .net and related technologies - June 14-16, 2010

    - by SanjeevAgarwal
    Web Development
    - ASP.Net MVC 2 Auto Complete Textbox With Custom View Model Attribute & EditorTemplate - Sean McAlinden
    - Localization with ASP.NET MVC ModelMetadata - Kazi Manzur Rashid
    - Securing Dynamic Data 4 (Replay) - Steve
    - Adding Client-Side Script to an MVC Conditional Validator - Simon Ince
    - jQuery: Storing and retrieving data related to elements - Rebecca Murphey
    Web Design
    - 48 Examples of Excellent Layout in Web Design...(read more)

    Read the article

  • No search data in Google Analytics or Webmasters

    - by cjk
    I have a domain that has been registered in Google Webmasters and using Google Analytics for over 4 months. I get lots of analytics data, but am getting no information on Google searches in Webmasters, or Queries in Search Engine Optimisation in Analytics, even though I am getting keywords for traffic coming to my site from search engines. I have a test sub-domain with the same setup (except not HTTPS) that is getting some of this information through, even with much less data and visits. What could be wrong to stop me getting this information?

    Read the article

  • Oracle Technology Forum event, Wednesday, May 5, 2010

    - by Fekete Zoltán
    Next Wednesday we are holding an Oracle Technology Forum day, where you can attend presentations in the database and developer tracks and get answers to your questions. Register for the event. In the database track I will talk about the technical gems of the Sun Oracle Database Machine / Exadata solutions, from both the transactional (OLTP) and data warehouse (DW) sides, as well as database consolidation. I will also highlight the new and interesting features of Oracle Data Mining and OLAP, and mention where Oracle's Data Warehouse Reference Architecture can be applied.

    Read the article

  • Google I/O 2012 - Storing Data in Google Apps Script

    Presented by Drew Csillag. This session covers the different ways developers can store data when using Google Apps Script. We'll break things down by use case, and then show examples of how to use the different options: spreadsheet, Script/User Properties, JDBC connector, and distribution. For all I/O 2012 sessions, go to developers.google.com. From: GoogleDevelopers. Time: 41:48.

    Read the article

  • How To Use Regular Expressions for Data Validation and Cleanup

    You need to provide data validation at the server level for complex strings like phone numbers, email addresses, etc. You may also need to do data cleanup / standardization before moving it from source to target. Although SQL Server provides a fair number of string functions, the code developed with these built-in functions can become complex and hard to maintain or reuse.
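    The snippet below is a rough, hypothetical sketch of the built-in approach the article is contrasting with regular expressions: T-SQL's LIKE and PATINDEX accept only simple wildcard patterns, which is why validation written with them grows complex as the rules grow. The phone-number format and names are assumptions for illustration only.

        -- Validate a US-style phone number (e.g. 555-123-4567) using LIKE character ranges only
        DECLARE @phone varchar(20) = '555-123-4567';

        IF @phone LIKE '[0-9][0-9][0-9]-[0-9][0-9][0-9]-[0-9][0-9][0-9][0-9]'
            PRINT 'Looks valid';
        ELSE
            PRINT 'Invalid format';

        -- Cleanup/standardization example: strip spaces and parentheses before validating
        SELECT REPLACE(REPLACE(REPLACE(@phone, ' ', ''), '(', ''), ')', '') AS CleanedPhone;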

    Read the article

  • Optimized Integration between Oracle Data Integrator and Oracle GoldenGate

    - by Alex Kotopoulis
    The Journalizing Knowledge Module for Oracle GoldenGate's CDC mechanism has just been released as part of the ODI 10.1.3.6_02 patch, and a very useful post from Mark Rittman gives detailed instructions on how to set it up in a sample environment. This integration combines the best of two worlds: the non-invasive, high-performance data replication of Oracle GoldenGate with the innovative and scalable E-LT concept of ODI for performing complex loads into real-time data warehouses. Please also check out the new datasheet about this integration.

    Read the article

  • Use WCF Data Services 1.5 with Silverlight, by Benjamin Roux

    Quote: This article will show you how to use Silverlight with WCF Data Services 1.5. First of all, why use Data Services 1.5? Quite simply because the integration with Silverlight has been greatly improved (INotifyPropertyChanged and ObservableCollection, two-way binding...). It's all over here. Don't hesitate to leave your comments right here.

    Read the article

  • Webcast Replay Available: E-Business Suite Data Protection

    - by BillSawyer
    I am pleased to release the replay and presentation for the latest ATG Live Webcast: E-Business Suite Data Protection (Presentation). Robert Armstrong, Product Strategy Security Architect, and Eric Bing, Senior Director, discussed the best practices and recommendations for securing your E-Business Suite data.

    Finding other recorded ATG webcasts: The catalog of ATG Live Webcast replays, presentations, and all ATG training materials is available in this blog's Webcasts and Training section.

    Read the article

  • Fetching Data from Multiple Tables using Joins

    Applying normalization to relational databases tends to promote better accuracy of queries, but it also leads to queries that take a little more work to develop, as the data may be spread amongst several tables. In today's article, we'll learn how to fetch data from multiple tables by using joins.
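    As a quick, hypothetical illustration of the idea (the Customers and Orders tables and their columns are invented for this sketch), an inner join reassembles the normalized data:

        -- Customer details live in Customers; Orders only carries the CustomerID foreign key
        SELECT c.CustomerName,
               o.OrderID,
               o.OrderDate
        FROM   dbo.Customers AS c
        INNER JOIN dbo.Orders AS o
               ON o.CustomerID = c.CustomerID
        ORDER BY o.OrderDate;

        -- A LEFT JOIN would additionally keep customers with no orders,
        -- returning NULLs in the Orders columns.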

    Read the article

  • How to reinstall Ubuntu keeping my data intact?

    - by Santosh Linkha
    I want to reinstall Ubuntu keeping my data intact. I have a 160 GB hard drive (SATA or PATA, I don't know, but it's slim and made in China) with a 40 GB ext3 partition, a 4 GB swap partition and 3 other partitions with a FAT32 file system. I have around 4 GB of free space on the partition where Linux is installed. I'd like to keep the data intact, especially the Downloads folder, the desktop, and /var/www. And I no longer have access to any other machines or external storage devices.

    Read the article

  • This Week on the Green Data Center Management Front

    Among the big news this week in green data center management: APC will demonstrate how to provide energy while addressing energy efficiency legislation; Altruent Systems announced it has completed a new energy-efficient data center for one of its key clients; and Voonami is unveiling what it claims is the greenest data center in Utah.

    Read the article

  • How to programmatically bind to a Core Data model?

    - by Dave Gallagher
    Hello. I have a Core Data model and was wondering if you know how to create a binding to an entity programmatically. Normally you use bind:toObject:withKeyPath:options: to create a binding, but I'm having a little difficulty getting this to work with Core Data, and couldn't find anything in Apple's docs about doing it programmatically.

    The Core Data model is simple:
    - An entity called Book
    - An attribute of Book called author (NSString)

    I have an object called BookController. It looks like this:

        @interface BookController : NSObject {
            NSString *anAuthor;
        }
        @property (nonatomic, retain) NSString *anAuthor;
        // @synthesize anAuthor; inside @implementation

    I'd like to bind anAuthor inside BookController to author inside a Book entity. This is how I'm attempting to do it (it only partially works):

        // A custom class I made, providing an interface to the Core Data database
        CoreData *db = [[CoreData alloc] init];

        // Creating a Book entity and saving it
        [db addMocObject:@"Book"];
        [db saveMoc];

        // Fetching the Book entity we just created
        NSArray *books = [db fetchObjectsForEntity:@"Book" withPredicate:nil withSortDescriptors:nil];
        NSManagedObject *book = [books objectAtIndex:0];

        // Creating the binding
        BookController *bookController = [[BookController alloc] init];
        [bookController bind:@"anAuthor" toObject:book withKeyPath:@"author" options:nil];

        // Manipulating the binding
        [bookController setAnAuthor:@"Bill Gates"];

    Now, when updating from the perspective of bookController, things don't work quite right:

        // Testing the binding from bookController's perspective
        [bookController setAnAuthor:@"Bill Gates"];

        // Prints: "bookController's anAuthor: Bill Gates"
        NSLog(@"bookController's anAuthor: %@", [bookController anAuthor]); // OK!

        // ERROR HERE - Prints: "Book's author: (null)"
        NSLog(@"Book's author: %@", [book valueForKey:@"author"]); // DOES NOT WORK! :(

    When updating from the perspective of the Book entity, things work fine:

        // Testing the binding from the Book (entity) perspective (this works perfectly)
        [book setValue:@"Steve Jobs" forKey:@"author"];

        // Prints: "bookController's anAuthor: Steve Jobs"
        NSLog(@"bookController's anAuthor: %@", [bookController anAuthor]); // OK!

        // Prints: "Book's author: Steve Jobs"
        NSLog(@"Book's author: %@", [book valueForKey:@"author"]); // OK!

    It appears that the binding is only partially working. I can update it on the model side and the change propagates up to the controller via KVO, but if I update it on the controller side, it doesn't trickle down to the model via KVC. Any idea what I'm doing wrong? Thanks so much for looking! :)

    Read the article

  • Determining distribution of NULL values

    - by AaronBertrand
    Today on the twitter hash tag #sqlhelp, @leenux_tux asked: "How can I figure out the percentage of fields that don't have data?" After further clarification, it turns out he is after what proportion of the values in each column are NULL. Some folks suggested using a data profiling task in SSIS. There may be some validity to that, but I'm still a fan of sticking to T-SQL when I can, so here is how I would approach it: Create a #temp table or @table variable to store the results. Create a cursor that loops through all...(read more)
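    For a single, known table, the core of that approach boils down to one aggregate query like the sketch below (the table and column names are hypothetical); the cursor described above generates the equivalent statement for every column of every table and stores the results in the temp table.

        -- Percentage of NULLs per column for one hypothetical table
        SELECT
            100.0 * SUM(CASE WHEN Email    IS NULL THEN 1 ELSE 0 END) / NULLIF(COUNT(*), 0) AS EmailNullPct,
            100.0 * SUM(CASE WHEN Phone    IS NULL THEN 1 ELSE 0 END) / NULLIF(COUNT(*), 0) AS PhoneNullPct,
            100.0 * SUM(CASE WHEN LastName IS NULL THEN 1 ELSE 0 END) / NULLIF(COUNT(*), 0) AS LastNameNullPct
        FROM dbo.Customers;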

    Read the article

  • Oracle acquires Pillar Data Systems

    - by nospam(at)example.com (Joerg Moellenkamp)
    So far it was an investment of Larry Ellison's, but now it's part of Oracle: Oracle has acquired Pillar Data Systems. You will find more information in the press release. As I can already smell some of the comments: Pillar Data Systems is majority owned by Oracle CEO Larry Ellison. The evaluation and negotiation of the transaction were led by an independent committee of Oracle's Board of Directors. The transaction is structured as a 100% earn-out with no up-front payment.

    Read the article

  • SQL Server Geography Data Type

    We are working on the migration to SQL Server 2008 and have geospatial data that we would like to move over as well. As part of our application we house information on locations across the globe. Which data type should we use?
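    For worldwide latitude/longitude data, the geography type is the usual choice in SQL Server 2008. The sketch below is hypothetical (the table name, locations and coordinates are made up) and shows a geography column, point construction and a distance calculation:

        -- Hypothetical table storing office locations as geography points
        CREATE TABLE dbo.Locations
        (
            LocationID int IDENTITY PRIMARY KEY,
            Name       nvarchar(100) NOT NULL,
            Position   geography     NOT NULL
        );

        -- geography::Point(latitude, longitude, SRID); SRID 4326 = WGS 84
        INSERT INTO dbo.Locations (Name, Position)
        VALUES (N'London', geography::Point(51.5074, -0.1278, 4326)),
               (N'Sydney', geography::Point(-33.8688, 151.2093, 4326));

        -- Distance between the two points, in meters
        SELECT a.Position.STDistance(b.Position) AS DistanceInMeters
        FROM dbo.Locations AS a
        CROSS JOIN dbo.Locations AS b
        WHERE a.Name = N'London' AND b.Name = N'Sydney';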

    Read the article

  • Oracle Data Warehouse Reference Architecture, based on best practices

    - by Fekete Zoltán
    How should we build a data warehouse, and how should we connect it to our systems? What architecture will get us to the goal most reliably and with the least risk? The Oracle Data Warehouse Reference Architecture paper answers these questions. The following document is available for download: Enabling Pervasive BI through a Practical Data Warehouse Reference Architecture

    Read the article

  • Stairway to XML: Level 2 - The XML Data Type

    Robert Sheldon describes SQL Server's XML data type, and shows that it is as easy to configure a variable, column, or parameter with the XML data type as it is to configure one of these objects with any other data type.
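    As a minimal, hypothetical sketch of those three cases (the object names and sample XML are invented for illustration):

        -- xml variable
        DECLARE @doc xml = N'<book><author>Robert Sheldon</author></book>';
        GO

        -- xml column
        CREATE TABLE dbo.Books
        (
            BookID  int IDENTITY PRIMARY KEY,
            Details xml NULL
        );
        GO

        -- xml parameter (CREATE PROCEDURE must start its own batch)
        CREATE PROCEDURE dbo.AddBook
            @Details xml
        AS
        BEGIN
            INSERT INTO dbo.Books (Details) VALUES (@Details);
        END;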

    Read the article

  • SQL SERVER Configure Management Data Collection in Quick Steps T-SQL Tuesday #005

    This article was written as a response to T-SQL Tuesday #005 - Reporting. The three most important components of any computer or server are the CPU, memory, and hard disk. This post talks about how to get more details on these three components using Management Data Collection. Management Data Collection generates the [...]

    Read the article

  • Stairway to XML: Level 4 - Querying XML Data

    You can extract a subset of data from an XML instance by using the query() method, and you can use the value() method to retrieve individual element and attribute values from an XML instance.
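    A short, hypothetical sketch of both methods against an ad-hoc xml variable (the document shape and names are invented for illustration):

        DECLARE @doc xml = N'
        <books>
          <book id="1"><title>Stairway to XML</title><author>Robert Sheldon</author></book>
          <book id="2"><title>Querying XML</title><author>Jane Doe</author></book>
        </books>';

        -- query() returns an xml fragment: here, every <title> element
        SELECT @doc.query('/books/book/title') AS Titles;

        -- value() returns a single scalar: the author of the book with id = 2
        -- (the XPath is wrapped in () with [1] so it is provably a singleton)
        SELECT @doc.value('(/books/book[@id="2"]/author)[1]', 'nvarchar(100)') AS Author;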

    Read the article

  • Trying to determine the best design for this workflow - C# 3.0

    - by ltech
    Input server - files of type jpg, tif, raw, png and mov come in via FTP. Each file needs to be watermarked, if applicable, and have metadata added to it. Then each file needs to be moved to an orders directory, where an order file is generated, and the result is packaged as a zip file and moved to the processing server. The file names are of the form [orderid_userid_guid].[jpg|tif|mov|png...]. As I expect the volume to grow, I don't want to work on one file at a time and move it through the workflow; I would prefer a multi-threaded/asynchronous approach if possible.

    Read the article
