Search Results

Search found 65101 results on 2605 pages for 'big data'.

Page 111/2605

  • Best practice? - Array/Dictionary as a Core Data Entity Attribute

    - by Run Loop
    I am new to Core Data. I have noticed that collection types are not available as attribute types, and I would like to know the most efficient way to store array/dictionary-type data as an attribute (e.g. the elements that make up an address, like street, city, etc., do not require a separate entity and are more conveniently stored as a dictionary/array than as separate attributes/fields). Thank you.

    Read the article

  • Validate Data Binding with Silverlight 3

    In this fourth part of the series we will take a look at how to validate data binding. We'll start by explaining why this is important and then walk through a step-by-step process that shows you how to do it. The next and final part of the series will discuss data conversion.

    Read the article

  • C# Creating thumbnail (low quality and big size problem)

    - by ile
        public void CreateThumbnail(Image img1, Photo photo, string targetDirectoryThumbs)
        {
            int newWidth = 700;
            int newHeight = 700;
            double ratio = 0;

            if (img1.Width > img1.Height)
            {
                ratio = img1.Width / (double)img1.Height;
                newHeight = (int)(newHeight / ratio);
            }
            else
            {
                ratio = img1.Height / (double)img1.Width;
                newWidth = (int)(newWidth / ratio);
            }

            Image bmp1 = img1.GetThumbnailImage(newWidth, newHeight, null, IntPtr.Zero);
            bmp1.Save(targetDirectoryThumbs + photo.PhotoID + ".jpg");

            img1.Dispose();
            bmp1.Dispose();
        }

    I've set the target size to 700px so that you can get a better sense of the problem. Here are the original image and the resized one. Any good recommendations? Thanks, Ile
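
    A likely culprit is that GetThumbnailImage reuses the image's small embedded EXIF thumbnail when one exists, so scaling it up to 700px looks blurry; the usual fix is to redraw the image yourself with a high-quality interpolation mode (in C#, Graphics.DrawImage with InterpolationMode.HighQualityBicubic). As an illustration of that redraw idea only, here is a minimal sketch in Java; the file names are placeholders:

        import java.awt.Graphics2D;
        import java.awt.RenderingHints;
        import java.awt.image.BufferedImage;
        import java.io.File;
        import java.io.IOException;
        import javax.imageio.ImageIO;

        public class ThumbnailMaker {

            // Scales the source so its longer side becomes maxSide, redrawing with bicubic interpolation.
            public static BufferedImage scale(BufferedImage src, int maxSide) {
                double ratio = Math.min((double) maxSide / src.getWidth(),
                                        (double) maxSide / src.getHeight());
                int w = (int) Math.round(src.getWidth() * ratio);
                int h = (int) Math.round(src.getHeight() * ratio);
                BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
                Graphics2D g = dst.createGraphics();
                g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                                   RenderingHints.VALUE_INTERPOLATION_BICUBIC);
                g.drawImage(src, 0, 0, w, h, null);
                g.dispose();
                return dst;
            }

            public static void main(String[] args) throws IOException {
                BufferedImage src = ImageIO.read(new File("original.jpg")); // placeholder input
                ImageIO.write(scale(src, 700), "jpg", new File("thumb.jpg"));
            }
        }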

    Read the article

  • Search in big text log files

    - by 0xFF
    Hi, let's say you have a game server that creates text log files of gamers' actions, and from time to time you need to look something up in those log files (like investigating a scam or a lost item). Just as an example, you have 100 files and each file is between 20MB and 50MB in size - how would you search them quickly? What I have already tried is creating several threads, where each individual thread maps its own file into memory (let's say memory is not a problem as long as it doesn't exceed 500MB of RAM) and performs the search there. The result was around 1 second per file:

        File:a26.log - read in: 0.891, lines: 625282, matches: 78848

    Is there a better way to do this? It seems kind of slow to me. Thanks. (Java was used for this case.)
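
    A minimal sketch of the thread-per-file idea described above, using plain buffered streaming rather than memory mapping; the file names and search term are hypothetical:

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.util.ArrayList;
        import java.util.List;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.Future;

        public class LogSearch {

            // Counts the lines of one file that contain `needle`; each file gets its own task.
            static long countMatches(Path file, String needle) throws IOException {
                long matches = 0;
                try (BufferedReader reader = Files.newBufferedReader(file)) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        if (line.contains(needle)) {
                            matches++;
                        }
                    }
                }
                return matches;
            }

            public static void main(String[] args) throws Exception {
                List<Path> files = List.of(Path.of("a26.log"), Path.of("a27.log")); // hypothetical file names
                String needle = "trade item=sword_of_doom";                         // hypothetical search term

                ExecutorService pool = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());
                List<Future<Long>> results = new ArrayList<>();
                for (Path file : files) {
                    results.add(pool.submit(() -> countMatches(file, needle)));
                }
                for (int i = 0; i < files.size(); i++) {
                    System.out.println(files.get(i) + " matches: " + results.get(i).get());
                }
                pool.shutdown();
            }
        }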

    Read the article

  • Let’s keep informed with “Data Explorer”

    - by Luca Zavarella
    At PASS Summit 2011 a new project was announced. It's a Microsoft SQL Azure Lab and its codename is Microsoft "Data Explorer". According to the official blog (http://blogs.msdn.com/b/dataexplorer/), this new tool provides an innovative way to acquire new knowledge from the data that interests you. In a nutshell, Data Explorer allows you to combine data from multiple sources, and to publish and share the result. In addition, you can generate data streams in the RESTful open format (Open Data Protocol), and they can then be used by other applications. Nonetheless, we can still use Excel or PowerPivot to analyze the results. Sources can be varied: Excel spreadsheets, text files, databases, Windows Azure Marketplace, etc. For those who are not familiar with this resource, I strongly suggest you keep an eye on the data services available in the Marketplace: https://datamarket.azure.com/browse/Data

    To tell the truth, as I read the above blog post, I was tempted to think of Data Explorer as an "SSIS on Azure" aimed at the Power User. In fact, reading the response from Tim Mallalieu (Group Program Manager of Data Explorer) to a comment on his post, my first impression was confirmed: "…we originally thinking of ourselves as Self-Service ETL. As we talked to more folks and started partnering with other teams we realized that would be an area that we can add value but that there were more opportunities emerging."

    The typical operations of the ETL phase (processing and organization of data in different formats) can be carried out thanks to Data Explorer Mashup; the original post includes an image of the tool. The flexibility in the manipulation of information is given by the Data Explorer Formula Language, a formula-based, Excel-style language. Anyone wishing to know more can check the project page in addition to the aforementioned blog: http://www.microsoft.com/en-us/sqlazurelabs/labs/dataexplorer.aspx

    In light of this new project, there is no doubt about Microsoft's intention to get closer and closer to the Power User, providing flexible and very easy-to-use tools for data analysis. The prime example of this is PowerPivot. The question that remains is always the same: having more Power Users in a company will implicitly mean having different data models representing the same reality. But this would inevitably lead to anarchic data management... What do you think about that?

    Read the article

  • Which has a faster data transfer rate? WIFI (tablet or cell phone, not LTE) or MicroSD (Class 10)?

    - by techaddict
    Which of the two methods of data transfer transfers data at a faster rate for smartphones and tablets: standard WIFI, or MicroSD cards? I wonder if it would actually be faster to access data on external storage than it would be to have the MicroSD card in my smartphone or tablet. Currently I have a Class 10 32GB MicroSD card in my cell phone. I am looking to get the new Google Nexus tablet, but it does not offer expandable internal storage. I wonder if that's really a detriment, because if WIFI is faster than MicroSD, then it would hardly matter at all that you couldn't expand the storage internally. If WIFI really is faster, and people caught on to this, then people could save a lot of money on lower-memory iPads/iPhones/iPods, tablets, and smartphones!

    Read the article

  • Google I/O 2010 - Batch data processing with App Engine

    Google I/O 2010 - Batch data processing with App Engine (App Engine 201), presented by Mike Aizatsky. In this session, attendees will learn how to write map() functions, how to do simple reduce() operations, how to run these over large datasets, and how App Engine is used to accomplish such parallelism. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 38:45.

    Read the article

  • power and modulo on the fly for big numbers

    - by user unknown
    I raise some base b to the power p and take that modulo m. Let's assume b=55170 or 55172 and m=3043839241 (which happens to be the square of 55171). The Linux calculator bc gives these results (we need them for control):

        echo "p=5606;b=55171;m=b*b;((b-1)^p)%m;((b+1)^p)%m" | bc
        2734550616
        309288627

    Now, calculating 55170^5606 gives a rather large number, but since I have to take it modulo m anyway, I thought I could avoid using BigInt, because of

        (a*b) % c == ((a%c) * (b%c)) % c
        i.e. (9*7) % 5 == ((9%5) * (7%5)) % 5  =>  63 % 5 == (4*2) % 5  =>  3 == 8 % 5

    and a^d = a^(b+c) = a^b * a^c, so I can split the exponent d in half, which gives d/2 and d-(d/2) for even or odd d; for 8^5 I can calculate 8^2 * 8^3. So my (defective) method, which always applies the modulo on the fly, looks like this:

        def powMod (b: Long, pot: Int, mod: Long) : Long = {
          if (pot == 1) b % mod
          else {
            val pot2 = pot/2
            val pm1 = powMod (b, pot2, mod)
            val pm2 = powMod (b, pot-pot2, mod)
            (pm1 * pm2) % mod
          }
        }

    Fed with some values:

        powMod (55170, 5606, 3043839241L)
        res2: Long = 1885539617
        powMod (55172, 5606, 3043839241L)
        res4: Long = 309288627

    As we can see, the second result is exactly the same as the one above, but the first one looks quite different. I'm doing a lot of such calculations, and they seem to be accurate as long as they stay in the range of Int, but I can't see the error. Using BigInt works as well, but is way too slow:

        def calc2 (n: Int, pri: Long) = {
          val p: BigInt = pri
          val p3 = p * p
          val p1 = (p-1).pow (n) % (p3)
          val p2 = (p+1).pow (n) % (p3)
          print ("p1: " + p1 + " p2: " + p2)
        }
        calc2 (5606, 55171)
        p1: 2734550616 p2: 309288627

    (the same results as with bc). Can somebody see the error in powMod?
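
    For reference, the two control values from the bc run above can be reproduced on the JVM with java.math.BigInteger.modPow, which applies the same reduce-as-you-go exponentiation idea described in the question; a minimal check might look like this:

        import java.math.BigInteger;

        public class ModPowCheck {
            public static void main(String[] args) {
                BigInteger m = BigInteger.valueOf(55171L).pow(2); // 3043839241
                BigInteger p = BigInteger.valueOf(5606L);
                // Expected from the bc output: 2734550616 and 309288627
                System.out.println(BigInteger.valueOf(55170L).modPow(p, m));
                System.out.println(BigInteger.valueOf(55172L).modPow(p, m));
            }
        }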

    Read the article

  • How do I write a Java text file viewer for big log files

    - by Hannes de Jager
    I am working on a software product with an integrated log file viewer. The problem is, it's slow and unstable for really large files because it reads the whole file into memory when you view a log file. I want to write a new log file viewer that addresses this problem. What are the best practices for writing viewers for large text files? How do editors like Notepad++ and Vim accomplish this? I was thinking of using a buffered, bidirectional text stream reader together with Java's TableModel. Am I thinking along the right lines, and are such stream implementations available for Java?
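
    One common approach is to leave the file on disk and only ever read the window of bytes that is currently visible, seeking straight to an offset instead of loading the whole file. A minimal sketch of that idea (a real viewer would also snap to line boundaries and index line offsets; the file name is a placeholder):

        import java.io.IOException;
        import java.io.RandomAccessFile;
        import java.nio.charset.StandardCharsets;

        public class LogWindowReader {

            private final RandomAccessFile file;

            public LogWindowReader(String path) throws IOException {
                this.file = new RandomAccessFile(path, "r");
            }

            // Returns up to maxBytes of text starting at the given byte offset.
            // Note: a window may start or end mid-line (or mid-character for multi-byte encodings).
            public String readWindow(long offset, int maxBytes) throws IOException {
                file.seek(offset);
                byte[] buf = new byte[maxBytes];
                int read = file.read(buf);
                return read < 0 ? "" : new String(buf, 0, read, StandardCharsets.UTF_8);
            }

            public long length() throws IOException {
                return file.length();
            }

            public void close() throws IOException {
                file.close();
            }

            public static void main(String[] args) throws IOException {
                LogWindowReader reader = new LogWindowReader("huge.log"); // placeholder file name
                System.out.println(reader.readWindow(0, 4096));           // first 4 KB only
                reader.close();
            }
        }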

    Read the article

  • How to download Yahoo historical stock data into xls format via MATLAB?

    - by Noob_1
    I have an xls sheet called Tickers (a matrix of 1 column and 500 rows) with Yahoo tickers. I want MATLAB to download the historical data for the last 5 years for each stock ticker into a separate xls spreadsheet and save it in a given directory, with the title of the sheet equal to the ticker. That means I want code that will create and save 500 tickers' worth of data in 500 separate spreadsheets :) Can anyone help or point me in the right direction?

    Read the article

  • How to find the crc32 of big files?

    - by Arsheep
    PHP's crc32() takes a string as input, and for a file the code below will of course work:

        crc32(file_get_contents("myfile.CSV"));

    But if the file gets huge (2 GB) it might raise an out-of-memory fatal error. So is there any way around this to find the checksum of huge files?
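
    The usual workaround is to stream the file through the checksum in chunks instead of reading it into memory in one go. As an illustration of that idea (shown here in Java with java.util.zip.CRC32; in PHP, the incremental hashing functions or hash_file() with the 'crc32b' algorithm should achieve the same thing, but treat those names as something to verify):

        import java.io.BufferedInputStream;
        import java.io.FileInputStream;
        import java.io.IOException;
        import java.util.zip.CRC32;
        import java.util.zip.CheckedInputStream;

        public class StreamingCrc32 {

            // Streams the file through a CRC32 checksum 64 KB at a time; memory use stays constant.
            public static long crc32Of(String path) throws IOException {
                CRC32 crc = new CRC32();
                try (CheckedInputStream in = new CheckedInputStream(
                        new BufferedInputStream(new FileInputStream(path)), crc)) {
                    byte[] buf = new byte[64 * 1024];
                    while (in.read(buf) != -1) {
                        // reading drives the checksum; nothing else to do here
                    }
                }
                return crc.getValue();
            }

            public static void main(String[] args) throws IOException {
                System.out.printf("%08x%n", crc32Of("myfile.CSV"));
            }
        }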

    Read the article

  • Refactoring. Your way to reduce code complexity of big class with big methods

    - by Andrew Florko
    I have a legacy class that is rather complex to maintain:

        class OldClass {
            method1(arg1, arg2) { ... 200 lines of code ... }
            method2(arg1) { ... 200 lines of code ... }
            ...
            method20(arg1, arg2, arg3) { ... 200 lines of code ... }
        }

    The methods are huge, unstructured and repetitive (the developer loved the copy/paste approach). I want to split each method into 3-5 small functions, with one public method and several helpers. What would you suggest? Several ideas come to mind:

    1. Add several private helper methods to each method and join them in a #region (straightforward refactoring).
    2. Use the Command pattern (one command class per OldClass method, in a separate file) - a sketch of this one is shown below.
    3. Create a helper static class per method, with one public method and several private helper methods; the OldClass methods delegate their implementation to the appropriate static class (very similar to commands).
    4. ?

    Thank you in advance!
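
    A minimal sketch of the second idea (one command class per legacy method), shown in Java even though the #region mention suggests C#; the names are hypothetical and only the shape of the delegation matters:

        // Hypothetical names throughout; each command hosts the small helpers for one legacy method.
        interface RuleCommand {
            void execute(Object... args);
        }

        // Former body of OldClass.method1, now split into small private helpers.
        class Method1Command implements RuleCommand {
            @Override
            public void execute(Object... args) {
                validate(args);
                transform(args);
                persist(args);
            }

            private void validate(Object... args)  { /* ...a few lines... */ }
            private void transform(Object... args) { /* ...a few lines... */ }
            private void persist(Object... args)   { /* ...a few lines... */ }
        }

        // The legacy class keeps its public API but delegates to the commands.
        class OldClass {
            private final RuleCommand method1Command = new Method1Command();

            public void method1(Object arg1, Object arg2) {
                method1Command.execute(arg1, arg2);
            }
        }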

    Read the article

  • Performance improvement to a big if clause in SQL Server function

    - by Miles D
    I am maintaining a function in SQL Server 2005 that, based on an integer input parameter, needs to call different functions, e.g.

        IF @rule_id = 1
            -- execute function 1
        ELSE IF @rule_id = 2
            -- execute function 2
        ELSE IF @rule_id = 3
        ... etc

    The problem is that there are a fair few rules (about 100), and although the above is fairly readable, its performance isn't great. At the moment it's implemented as a series of IF's that do a binary-chop, which is much faster, but becomes fairly unpleasant to read and maintain. Any alternative ideas for something that performs well and is fairly maintainable?

    Read the article

  • iPhone Simulating App Update at home before going out in the big bad world

    - by Aran Mulholland
    This is a follow-on from this question. From the link given, it seems that when an app is updated, all of the files in the Documents directory are copied into the updated app's Documents directory, and so is anything in Library/Preferences. What's the best way to simulate this for testing purposes? Just copy the files in ApplicationSupport/iPhone Simulator, etc.? Or has anyone developed any funky techniques for testing this?

    Read the article

  • Chapter 3: Data-Tier Applications

    With the release of Microsoft SQL Server 2008 R2, the SQL Server Manageability team addressed these struggles by introducing support for data-tier applications to help streamline the deployment, management, and upgrade of database applications. A data-tier application, also referred to as a DAC, is a single unit of deployment that contains all the elements used by an application, such as the database application schema, instance-level objects, associated database objects, files and scripts, and even a manifest defining the organization's deployment requirements.

    Read the article

  • Big time Leaking in Objective-C Category

    - by Daniel Amitay
    I created a custom NSString category which lets me find all strings between two other strings. I'm now running into the problem that my code is leaking a lot of kBs. Please see the code below:

        #import "MyStringBetween.h"

        @implementation NSString (MyStringBetween)

        -(NSArray *)mystringBetween:(NSString *)aString and:(NSString *)bString;
        {
            NSAutoreleasePool *autoreleasepool = [[NSAutoreleasePool alloc] init];

            NSArray *firstlist = [self componentsSeparatedByString:bString];
            NSMutableArray *finalArray = [[NSMutableArray alloc] init];

            for (int y = 0; y < firstlist.count - 1 ; y++)
            {
                NSString *firstObject = [firstlist objectAtIndex:y];
                NSMutableArray *secondlist = [firstObject componentsSeparatedByString:aString];

                if (secondlist.count > 1) {
                    [finalArray addObject:[secondlist objectAtIndex:secondlist.count - 1]];
                }
            }

            [autoreleasepool release];
            return finalArray;
        }

        @end

    I admit that I'm not super good at releasing objects, but I had believed that the NSAutoreleasePool handled things for me. The line that is leaking:

        NSMutableArray *secondlist = [firstObject componentsSeparatedByString:aString];

    Manually releasing secondlist raises an exception. Thanks in advance!

    Read the article

  • Is there a way to visualize records stored in an iPhone app via Core Data?

    - by Justin Searls
    I have an app which, for good reasons, can only be debugged on a device. I'm using Core Data for the first time, and I'd like to be able to easily inspect the records that are stored by the app on the device. I imagine that Core Data is by default backed by SQLite on the iPhone, so this question might be as simple as asking: "What's the easiest way to extract the SQLite database for an app installed by Xcode without jailbreaking it?" Any experience someone could lend regarding this would be greatly appreciated.

    Read the article

  • Breaking One Big Graphic of MutablePaths into CAShapeLayers

    - by StackOverFlowRider
    I have a class called GraphicView that takes a Graphic object and draws it in its drawRect method. This Graphic object is basically an array of mutablePaths that together make up an icon I want drawn. For performance and other reasons, I was thinking of taking this icon, which is composed of mutablePaths, and dividing it into a bunch of CAShapeLayers. I'm wondering: is this possible? Considering that the points for the mutablePaths of the icon are all interwoven (i.e. the icon was initially an SVG file that I converted to code), is it possible to divide different parts of the icon into CAShapeLayers and reassemble them all together when assigning them to the view's layer? If so, how would it be done? If I assign them as sublayers to a CALayer or CAShapeLayer, will it understand how to mesh them all together?

    Read the article

  • What's the big deal with brute force on hashes like MD5

    - by Jan Kuboschek
    I just spent some time reading http://stackoverflow.com/questions/2768248/is-md5-really-that-bad (I highly recommend it!). In it, there's a discussion of hash collisions. Maybe I'm missing something here, but can't you just hash your password using, say, MD5 and then, say, SHA-1 (or any other, it doesn't matter)? Wouldn't this increase the processing power required to brute-force the hash and reduce the possibility of collisions?
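
    For concreteness, chaining the two digests the question describes is a few lines with java.security.MessageDigest; a minimal sketch (an illustration of the chaining only, not a recommendation for password storage):

        import java.nio.charset.StandardCharsets;
        import java.security.MessageDigest;
        import java.security.NoSuchAlgorithmException;

        public class ChainedHash {

            // Hashes the input with MD5, then hashes the 16-byte MD5 digest with SHA-1.
            static String md5ThenSha1(String password) throws NoSuchAlgorithmException {
                byte[] md5 = MessageDigest.getInstance("MD5")
                        .digest(password.getBytes(StandardCharsets.UTF_8));
                byte[] sha1 = MessageDigest.getInstance("SHA-1").digest(md5);
                StringBuilder hex = new StringBuilder();
                for (byte b : sha1) {
                    hex.append(String.format("%02x", b));
                }
                return hex.toString();
            }

            public static void main(String[] args) throws NoSuchAlgorithmException {
                System.out.println(md5ThenSha1("correct horse battery staple")); // example input
            }
        }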

    Read the article

  • JPA GeneratedValue with GenerationType.TABLE does a big jump after jvm restart

    - by joeduardo
    When I start my server and add an entry, the generated IDs will start with 1, 2, and so on and so forth. After a restart, adding an entry generates an ID like 32,xxx. Another restart and adding an entry generates an ID like 65,xxx. I don't know why this is happening. Here's a snippet of the annotation I'm using for my ID. I'm using Hibernate.

        @Id
        @GeneratedValue(strategy = GenerationType.TABLE)
        private Long id;
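
    The jump is usually a symptom of the table generator handing out IDs in preallocated blocks: on each restart the provider abandons the rest of the block it had reserved in memory and continues from the next block. A sketch of making the block size explicit with a named @TableGenerator, whose allocationSize controls the block size (the entity, generator and table names are hypothetical, and the exact default block size depends on the provider):

        import javax.persistence.Entity;
        import javax.persistence.GeneratedValue;
        import javax.persistence.GenerationType;
        import javax.persistence.Id;
        import javax.persistence.TableGenerator;

        @Entity
        public class LogEntry { // hypothetical entity name

            @Id
            @TableGenerator(
                    name = "log_entry_ids",         // hypothetical generator name
                    table = "id_generator",         // hypothetical table holding the counters
                    pkColumnName = "sequence_name",
                    valueColumnName = "next_value",
                    allocationSize = 1)             // small blocks mean small (or no) jumps, at the cost of more DB round trips
            @GeneratedValue(strategy = GenerationType.TABLE, generator = "log_entry_ids")
            private Long id;

            public Long getId() {
                return id;
            }
        }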

    Read the article

  • PHP, passing data between pages without using the URL?

    - by terabytest
    I have a PHP page with a form that asks for an e-mail address. When you press the send button, it goes to another PHP page, which gets the form data and does its stuff. I then need to be able to go back to the old page (the one that contained the form) and give it some data, so that it can change itself to say "You've sent your e-mail successfully" and not display the form. How do I do it?

    Read the article
