Search Results

Search found 9064 results on 363 pages for 'big twisty'.


  • Pack of resources in one big file with XNA

    - by Cristian
    Is it possible to pack all the little .xnb files into one big file? Given the level of abstraction of the XNA Framework, I thought this would come out of the box, but I can't find any well-integrated solution. So far the best candidate is XnaZip, but in addition to having to compile the resources in a post-build event and some trouble porting the game to Xbox, I would have to rename all the references to resources I have already implemented.

    Read the article

  • It's the Freedom You Big Dummy

    Daniweb: "No one has given his life for Linux but certainly there have been sacrifices. But, like their armed soldier counterparts, it isn't about the sacrifice, it's the freedom you big dummy."

    Read the article

  • Come See Us at the Oracle Big Data Forum on April 5!

    - by Kinoa
    Big Data is coming to the forefront more and more often, and you would like to learn more about it? Generated from social networks, digital sensors, and other mobile devices, Big Data - in other words, enormous volumes of data - is a mine of precious information about your activities and your customers' behavior. Your challenge today is to manage the acquisition, organization, and understanding of these volumes of unstructured data, and to integrate them into your information system. You have questions? Does it seem complex? Then the Oracle Big Data Forum, organized by Oracle and Intel, is made for you! We will cover several topics: accelerating Big Data deployment through an integrated hardware and software approach; providing all the tools needed for the complete process, from data acquisition to presentation of results; and integrating Big Data into your information system to give users the essence of the information. We have put together a most enticing program for April 5:
    9:00 Welcome and badge pickup
    9:30 Big Data: The Industry View. Are you ready? Johan Hendrickx, Core Technology Director, Oracle EMEA
    Keynote: Big Data - Are you ready? George Lumpkin, Vice President of DW Product Management, Oracle Corporation
    Data acquisition into your Big Data with Hadoop and Oracle NoSQL
    Break
    Organize and structure the information within your Big Data with Big Data Connectors and Oracle Data Integrator
    Take advantage of the data analysis in your Big Data with Oracle Endeca and Oracle Business Intelligence
    13:00 Buffet lunch
    Seats are limited, so remember to register now. Location: Maison de la Chimie, 28 B, rue Saint Dominique, 75007 Paris

    Read the article

  • What makes a project big?

    - by Jonny
    Just out of curiosity, what's the difference between a small, medium, and large project? Is it measured by lines of code, by complexity, or by something else? I'm building a bartering system and so far have about 1,000 lines of code for login/registration. Even though that's a lot of LOC, I wouldn't consider it a big project because it's not that complex, though this is my first project so I'm not sure. How is it measured?

    Read the article

  • Some tips for working with big data models

    The main goal of this article is to present some tips to help professionals who need to work with the complex, big, and hard-to-understand database models that anyone may come across some day.

    Read the article

  • Big-name School for Undergrad Students

    - by itaiferber
    As a soon-to-be graduating high school senior in the U.S., I'm going to be facing a tough decision in a few months: which college should I go to? Will it be worth it to go to Cornell or Stanford or Carnegie Mellon (assuming I get in, of course) to get a big-name computer science degree, internships, and connections with professors, while taking on massive debt; or am I better off going to SUNY Binghamton (probably the best state school in New York) and still getting a pretty decent education while saving myself from over a hundred thousand dollars' worth of debt?

    Yes, I know questions like this have been asked before (namely here and here), but please bear with me, because I haven't found an answer that fits my particular situation. I've read the two linked questions above in depth, but they haven't answered what I want to know:

    Yes, I understand that going to a big-name college can potentially get me connected with some wonderful professors and leaders in the field, but on average, how does that translate financially? I mean, will good connections pay off so well that I'd easily get rid of over a hundred thousand dollars of debt? And how does the fact that I can get a fifth-year master's degree at Carnegie Mellon play into the equation? Will the higher degree right off the bat help me get a better-paying job just out of college, or will the extra year only put me further into debt? Not having to go to graduate school to get a comparable degree will, of course, be a great financial relief, but will getting it so early give it any greater worth? And if I go to SUNY Binghamton, which is far less well known than the others I've considered (although if there are any alumni out there who want to share their experience, I would greatly appreciate it), would I be closing off doors that would potentially offset my short-term economic gain with long-term benefits? Essentially, is the short-term benefit outweighed by a potential long-term loss?

    The answers to these questions all tie in to my final college decision (again, provided I make it to these schools), so I hope that asking the skilled and knowledgeable people of the field will help me make the right choice (if there is such a thing). Also, please note: I'm in a rather peculiar situation where I can't pay for college without taking out a bunch of loans, but I will be getting little to no financial aid (federal or otherwise). I don't want to elaborate on this too much (so take it at face value), but this is mainly the reason I'm asking the question. Thanks a lot! It means a lot to me.

    Read the article

  • Showrooming: What's the big deal?

    - by David Dorf
    There's been lots of chatter recently on how retailers will combat showrooming this holiday season. Best Buy and Target, for example, plan to price-match certain online sites. But from my perspective, the whole showrooming concept is overblown. Yes, mobile phones make it easier to comparison-shop, but consumers have been doing that all along. Retailers have to work hard to merchandise their stores with the right products at the right price with the right promotions. It's Retail 101. Yeah, OK, many websites don't have to charge tax, so they have an advantage, but they also have to cover shipping costs. Brick-and-mortar stores have the opportunity to provide expertise, fit, and instant gratification, all of which are pretty big advantages.

    I see lots of studies that claim a large percentage of shoppers are showrooming. Now, I don't do much shopping, but when I do I rarely see anyone scanning UPC codes in the aisles. If you dig into those studies, the question is usually something like, "Have you used your mobile phone to price-compare while shopping in the last year?" Well, yeah, I did it once, out of the 20 shopping trips. And by the way, the in-store price was close enough to just buy the item. Based on casual observation and informal surveys of friends, showrooming is not the modus operandi for today's busy shoppers. I never see people showrooming in grocery stores, and most people don't bother for fashion. For big purchases like appliances and furniture, I bet most people do their research online before entering the store. The cases where I've done it were to see if a promotion was in fact a good deal, or to make sure the in-store price was the same as the online price for the same brand.

    So, if you think you're a victim of showrooming, I suggest you look at the bigger picture. Are you providing an engaging store experience? Are you allowing customers to shop the way they want to shop, using various touchpoints? Are you monitoring the competition to ensure prices are competitive? Are your promotions attracting the right customers? Hubert Joly, CEO of Best Buy, recently commented that showrooming might just get more people into his stores: "Once customers are in our stores, they're ours to lose."

    Read the article

  • SEO Consulting For Big Brand Companies - 16 Guidelines For SEO Consultants to Beat the Competition

    SEO consulting for a big-brand website with tens of thousands of pages requires proven strategies tailored to the specific needs of each site. An SEO consultant, when selecting between different SEO services, must create an aggressive search engine marketing (SEM) campaign with a meticulous SEO strategy that takes all search engine optimization problems into consideration.

    Read the article

  • Ubuntu image size 732 MB - too big for CD

    - by memius
    I have an old PC that can't handle a boot-stick install, so I have to create an actual, old-fashioned boot CD. However, the image size for Ubuntu 12.04 is 732 MB, which is too large for CDs, which can hold only 700 MB. The maintainers of Ubuntu 12.04 say the image size will never go over 700 MB, and indeed the download size seemed to be 689 MB. Brasero says it won't burn the CD because the file is too big. What's going on?

    Read the article

  • Experiments in Big Data Visualization on Maps

    Brendan Kenny and Mano Marks continue their series on using the CanvasLayer library and HTML5 APIs to visualize large amounts of data on top of Google maps. This week they look at loading Shapefiles and KML directly in the browser and using WebGL to render their content over a map. From: GoogleDevelopers

    Read the article

  • Performance Tuning in the Age of Big Data

    Database Administrators must now deal with large volumes of data and new forms of high-speed data analysis. If your responsibility includes performance tuning, here are the areas to focus on that will become more and more important in the age of Big Data.

    Read the article

  • Cloud just for hosting big files?

    - by yes123
    I need a solution to store my big files (50 MB+ each). Currently I am using a European dedicated server (100 Mbit/s) with 8,000 GB/month at 60 USD. I would like to use a cloud service that automatically fetches my files from my server the first time users request them (like a classic CDN), so I can keep all files stored on one server. I was looking at Amazon CloudFront and, to get the same bandwidth of 8,000 GB/month, I would have to pay around 2,000 USD versus the 60 USD for my dedicated server. Is there a cheaper alternative?

    Read the article

  • SQL SERVER – master Database Log File Grew Too Big

    - by pinaldave
    A couple of days ago I received the following email. I found it very interesting and felt like sharing it with all of you. Note: Please read the whole email before providing your suggestions.

    "Hi Pinal, If you can share these details on your blog, it will help many. We understand the value of the master database and we take regular backups of it (every day at midnight). Yesterday we noticed that our master database log file has grown very large. This is the very first time that we have encountered such an issue. The master database is in simple recovery mode, so we assumed that it would never grow big; however, we now have a big log file. We ran the following command:

    USE [master]
    GO
    DBCC SHRINKFILE (N'mastlog' , 0, TRUNCATEONLY)
    GO

    We know this command will break the LSN chain, but as per our understanding it should not matter, as we are in the simple recovery model. After running this, the log file became very small. Just to be cautious, we took a full backup of the master database right away. We totally understand that this is not the normal practice, so if you are going to tell us the same, we are aware of it. However, here is the question for you: what operation in the master database would have caused our log file to grow so large? Thanks, [name and company name removed as per request]"

    Here was my response to them:

    "Hi [name removed], It is great that you are aware of all the right steps and methods. Taking a full backup when you are not sure is always a good practice. Regarding your question about what could have caused your master database log to grow so large, let me try to guess what could have happened. Do you have any user tables in the master database? If yes, this is not recommended and also NOT a good practice. If you have user tables in the master database and you run any long operation on them (maybe lots of inserts, updates, deletes, or rebuilds), then it can cause this situation. You have made me curious about your scenario; do revert back. Kind Regards, Pinal"

    Within a few minutes I received a reply: "That was it, Pinal. We had one of the maintenance task log tables created in the master database, which had many long transactions during the night. We moved it to a newly created database named 'maintenance', and we will keep you updated."

    I was very glad to receive the email. I do not suggest that any user table be created in the master database. It should be left alone, free of user objects. Now here is the question for you: can you think of any other reason for master log file growth?

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Bridging the Gap in Cloud, Big Data, and Real-time

    - by Dain C. Hansen
    With all the buzz around big data and cloud computing, it is easy to overlook one of your most precious commodities: your data. Today's businesses cannot stand still when it comes to data. Market success now depends on speed, volume, complexity, and keeping pace with the latest data integration breakthroughs. Are you up to speed with big data, cloud integration, and real-time analytics? Join us in this three-part blog series where we'll look at each component in more detail. Meet us online on October 24th, where we'll take your questions about the issues you are facing in this brave new world of integration.

    Let's start first with the cloud. What happens to your data when you decide to implement a private cloud architecture? Or a public cloud? Data integration solutions play a vital role in migrating data simply, efficiently, and reliably to the cloud; they are a necessary ingredient of any platform-as-a-service strategy because they support cloud deployments with data-layer application integration between on-premise and cloud environments of all kinds. For private cloud architectures, consolidation of your databases and data stores is an important step to take to receive the full benefits of cloud computing. Private cloud integration requires bidirectional replication between heterogeneous systems to allow you to perform data consolidation without interrupting your business operations. In addition, bulk loading and transforming data into and out of your private cloud is a crucial step for companies moving to a private cloud. There is also a need to manage data services as part of SOA/BPM solutions that enable agile application delivery and help organizations build shared data services.

    But what about the public cloud? If you have moved your data to a public cloud application, you may also need to connect your on-premise enterprise systems and the cloud environment by moving data in bulk or as real-time transactions across geographies. For both public and private cloud architectures, Oracle offers a complete and extensible set of integration options that span not only data integration but also service and process integration, security, and management. For those companies investing in Oracle Cloud, you can move your data through Oracle SOA Suite using REST APIs to Oracle Messaging Cloud Service, a new service that lets applications deployed in Oracle Cloud communicate securely and reliably over the Java Message Service. As an example of loading and transforming data into other public clouds, Oracle Data Integrator supports a knowledge module for Salesforce.com, now available on AppExchange. Other third-party knowledge modules are being developed by customers and partners every day.

    To learn more about how to leverage Oracle's Data Integration products for the cloud, join us live: Data Integration Breakthroughs Webcast on October 24th, 10 AM PST.

    Read the article

  • Persevering & Friday Night Big Ideas

    - by Oracle Accelerate for Midsize Companies
    by Jim Lein, Oracle Midsize Programs

    Every successful company, personal accomplishment, and philanthropic endeavor starts with one good idea. I have my best ideas on Friday evenings. The creative side of my brain is stimulated by end-of-week endorphins. Free thinking. Anything is possible. But, as my kids love to remind me, most of Dad's Friday Night Big Ideas (FNBIs) fizzle on the drawing board. Usually there's one barrier blocking the way that seems insurmountable by noon on Monday. For example, trekking the 486-mile Colorado Trail is on my bucket list. Since I have a job, I'll have to do it in bits and pieces: day hikes, weekends, and a vacation week here and there. With my trick neck, backpacking is not an option. How to equip myself for overnight backcountry travel was that one seemingly insurmountable barrier.

    Persevering. Lewis and Clark wouldn't have given up, so I explored options and, as I blogged about back in December, I had an FNBI to hire llamas to carry my load. Last weekend, that idea came to fruition. Early Saturday morning, I met up with Bill, the owner of Antero Llamas, for an overnight training expedition along segment 14 of the Colorado Trail with a string of twelve llamas. It was a crash course in learning how to saddle, load, pasture, and mediate squabbles. Amazingly, we left the trailhead with me, the complete novice, at the lead. Instead of trying to impart three decades of knowledge on me in two days, Bill taught me two things: "Go With the Flow" and "Plan B". It worked. There were times I would be lost in thought for long stretches until one snort would remind me that I had a string of twelve llamas trailing behind.

    A funny thing happened along the trail... Up until last Saturday, my plan had been to trek all 28 segments of the trail east to west and sequentially, out of some self-imposed sense of decorum. That plan presented myriad logistical challenges, such as impassable snowpack on the Continental Divide when segment 6 is up next. On Sunday, as we trekked along the base of 14,000 ft peaks, I applied Bill's llama-handling philosophy to my quest and came up with a much more realistic and enjoyable strategy for achieving my goal: seize opportunities to hike regardless of order; define my own segments; go west to east for a while if it makes more sense; let the llamas carry more creature comforts; chill out. I will still set foot on all 486 miles of the trail. Technically, the end result will be the same. And I and my traveling companions, human and camelid, will enjoy the journey more. Much more.

    Got Big Ideas of Your Own? Check out Tongal. This growing Oracle customer works with brands to crowd-source fantastic ideas for promoting products and services. Your great idea could earn you cash. Looking for more news and information about Oracle Solutions for Midsize Companies? Read the latest Oracle for Midsize Companies Newsletter. Sign up to receive the latest communications from Oracle's industry leaders and experts.

    Jim Lein: I evangelize Oracle's enterprise solutions for growing midsize companies. I recently celebrated 15 years with Oracle, having joined JD Edwards in 1999. I'm based in Evergreen, Colorado and love relating stories about creativity and innovation, whether they be about software, live music, or the mountains. The views expressed here are my own, and not necessarily those of Oracle.

    Read the article

  • RPi and Java Embedded GPIO: Big Data and Java Technology

    - by hinkmond
    Java Embedded and Big Data go hand in hand, especially as demonstrated by prototyping on a Raspberry Pi to show how well the Java Embedded platform can perform on a small embedded device, which then becomes the proof of concept for industrial controllers, medical equipment, networking gear, or any type of sensor-connected device generating large amounts of data. The key is a fast and reliable way to access that data using Java technology. In the previous blog posts you've seen the integration of a static electricity sensor and the Raspberry Pi through the GPIO port, then accessing that data through Java Embedded code. It's important to point out how this works and why it works well with Java code.

    First, the version of Linux (Debian Wheezy/Raspbian) found on the RPi has a very convenient way to access the GPIO ports through Linux OS managed file handles. This is key in avoiding terrible and complex register manipulation in C code, or having to program in a less elegant, clumsy procedural scripting language such as Python. Instead, Java Embedded gives us a fast way to access those GPIO ports through those same Linux file handles. Java already has an easy, high-performance way to work with file handles that matches direct access of those file handles in the Linux OS. Using the Java API java.io.FileWriter lets us open the same file handles that the Linux OS has for accessing the GPIO ports. Then, by first resetting the ports using the unexport and export file handles, we can initialize them for easy use in a Java app.

    // Open file handles to GPIO port unexport and export controls
    FileWriter unexportFile = new FileWriter("/sys/class/gpio/unexport");
    FileWriter exportFile = new FileWriter("/sys/class/gpio/export");
    ...
    // Reset the port
    unexportFile.write(gpioChannel);
    unexportFile.flush();
    // Set the port for use
    exportFile.write(gpioChannel);
    exportFile.flush();

    Then, another set of file handles can be used by the Java app to control the direction of the GPIO port by writing either "in" or "out" to the direction file handle.

    // Open file handle to input/output direction control of port
    FileWriter directionFile = new FileWriter("/sys/class/gpio/gpio" + gpioChannel + "/direction");
    // Set port for input
    directionFile.write("in");
    // Or, use "out" for output
    directionFile.flush();

    And, finally, a RandomAccessFile handle can be used, with performance on par with native C code (only milliseconds to read in data and write out data) and low overhead (unlike Python), to manipulate the data going in and out on the GPIO port, while the object-oriented nature of Java programming allows for an easy way to construct complex analytic software around that data access functionality to the external world.

    RandomAccessFile[] raf = new RandomAccessFile[GpioChannels.length];
    ...
    // Reset file seek pointer to read latest value of GPIO port
    raf[channum].seek(0);
    raf[channum].read(inBytes);
    inLine = new String(inBytes);

    It's Big Data from sensors and industrial/medical/networking equipment meeting complex analytical software on a small constrained device (like a Linux/ARM RPi), where Java Embedded allows you to shine as an Embedded Device Software Designer. Hinkmond
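    Pulling those fragments together, here is a minimal, self-contained sketch of the sysfs pattern described above. It assumes a Raspberry Pi with the /sys/class/gpio interface available and a sensor wired to the chosen pin; the pin number, loop count, and class name are illustrative and not taken from the post.

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.RandomAccessFile;

    // Minimal sketch: export a GPIO pin, set it as an input, and poll its value.
    public class GpioReadSketch {
        public static void main(String[] args) throws IOException, InterruptedException {
            String gpioChannel = "7"; // hypothetical pin number; adjust for your wiring

            // Reset the port (ignore the error if it was never exported), then export it
            try (FileWriter unexport = new FileWriter("/sys/class/gpio/unexport")) {
                unexport.write(gpioChannel);
            } catch (IOException ignored) {
            }
            try (FileWriter export = new FileWriter("/sys/class/gpio/export")) {
                export.write(gpioChannel);
            }

            // Configure the pin as an input
            try (FileWriter direction = new FileWriter(
                    "/sys/class/gpio/gpio" + gpioChannel + "/direction")) {
                direction.write("in");
            }

            // Poll the value file; seek(0) rereads the latest state on each pass
            try (RandomAccessFile value = new RandomAccessFile(
                    "/sys/class/gpio/gpio" + gpioChannel + "/value", "r")) {
                byte[] inBytes = new byte[1];
                for (int i = 0; i < 10; i++) {
                    value.seek(0);
                    value.read(inBytes);
                    System.out.println("GPIO " + gpioChannel + " = " + new String(inBytes));
                    Thread.sleep(500);
                }
            }
        }
    }

    Run it as root (or as a user in the gpio group) so the sysfs files are writable; on a desktop machine the paths simply will not exist.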

    Read the article

  • SQL – Download FREE Book – Data Access for Highly Scalable Solutions: Using SQL, NoSQL, and Polyglot Persistence

    - by Pinal Dave
    Recently I was preparing for Big Data and I ended up on a very interesting read for everybody. It was created by Microsoft and, in my opinion, it is indeed a fantastic read. It took me some time to read the entire book, but it was worth it, as it tries to answer two very interesting questions related to NoSQL. Here is the abstract from the book: Organizations seeking to use a NoSQL database are therefore faced with a twofold challenge: • Which NoSQL database(s) best meet(s) the needs of the organization? • How does an organization integrate a NoSQL database into its solutions? As I kept on reading the book, I found it very interesting and informative. I suggest that if you have time this weekend, you download the book and read it. This guide focuses on the most common types of NoSQL database currently available, describes the situations for which they are most suited, and shows examples of how you might incorporate them into a business application. The guide summarizes the experiences of a fictitious organization named Adventure Works, who implemented a solution that comprised an assortment of different databases. Download Data Access for Highly Scalable Solutions: Using SQL, NoSQL, and Polyglot Persistence. While we are talking about Big Data and NoSQL, do not forget to check out my blog tomorrow, as I am going to talk about the same subject and it will be very interesting. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, NoSQL, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • BFCM – Big Fat Check Mark

    - by onefloridacoder
    I was installing TFS on my local laptop last week and just got around to setting up my initial collection using the TFS Console tool and “Bang!”  I received a message that told me that my local database didn’t have the full-text search option installed.  I remember the option in a (long) list of options and didn’t remember fiddling with it.   Whatever the reason, if you are installing TFS Basic on your box, make sure you have that little check ticked, or you won’t get the big fat one pictured above.  I installed SQL 2008 Developer edition which worked well for what I needed so far, and just needed to run the “Add Feature” option instead of the “Repair” option. HTH

    Read the article

  • How big can my SharePoint 2010 installation be?

    - by Sahil Malik
    3 years ago, I had published "How big can my SharePoint 2007 installation be?" Well, SharePoint 2010 has significant under-the-covers improvements. So, how big can your SharePoint 2010 installation be? There are three kinds of limits you should know about: hard limits that cannot be exceeded by design; configurable limits that are, well, configurable, but whose default values are set for a pretty good reason, so if you need to tweak, plan and understand before you tweak; and soft limits, which you can exceed, but it is not recommended that you do. Before you read any of the limits, read these two important disclaimers: 1. The limit depends on what you're doing, so don't take the below as gospel; the reality depends on your situation. 2. There are many additional considerations in planning your SharePoint solution scalability and performance besides just the below. So with those in mind, here goes.

    Hard Limits
    - Zones per web app: 5
    - RBS NAS performance: time to first byte of any response from the NAS must be less than 20 milliseconds
    - List row size: 8000 bytes, driven by how SharePoint stores list items internally
    - Max file size: 2 GB (default is 50 MB, configurable); RBS does not increase this limit
    - Search metadata properties: 10,000 per item crawled (pretty damn high, you'll never need to worry about it)
    - Max # of concurrent in-memory enterprise content types: 5000 per web server, per tenant
    - Max # of external system connections: 500 per web server
    - PerformancePoint services using Excel Services as a data source: no single query can fetch more than 1 million Excel cells
    - Office Web Apps rendering: one doc per second, per CPU core, per application server, limited to a maximum of 8 cores

    Configurable Limits
    - Row size limit: 6, configurable via the SPWebApplication.MaxListItemRowStorage property
    - List view lookup: 8 join operations per query
    - Max number of list items that a single operation can process at one time in normal hours: 5000, configurable via SPWebApplication.MaxItemsPerThrottledOperation; you also get a warning at 3000, configurable via SPWebApplication.MaxItemsPerThrottledOperationWarningLevel. In addition, throttle overrides can be requested, the throttle can be disabled, and time windows can be set when the throttle is disabled
    - Max number of list items for administrators that a single operation can process at one time in normal hours: 20,000, configurable via SPWebApplication.MaxItemsPerThrottledOperationOverride
    - Enumerating subsites: 2000
    - Word and PowerPoint co-authoring simultaneous editors: 10 (hard limit is 99)
    - # of web parts on a page: 25
    - Search crawl DBs per search service app: 10
    - Items per crawl DB: 25 million
    - Search keywords: 200 per site collection; there is a max limit of 5000, which can then be modified by editing the web.config/client.config
    - Concurrent # of workflows on a content DB: 15; workflows running in the timer service are not counted in this limit, and further workflows are queued. Can be configured via the Set-SPFarmConfig PowerShell commandlet
    - Number of events picked up by the workflow timer job and delivered to workflows: 100; you can increase this limit by running additional instances of the workflow timer service
    - Visio Services file size: 50 MB
    - Visio web drawing recalculation timeout: 120 seconds, configurable via the Set-SPVisioPerformance PowerShell commandlet
    - Visio Services minimum and maximum cache age for data-connected diagrams: 0 to 24 hours, default 60 minutes, configurable via the Set-SPVisioPerformance PowerShell commandlet

    Soft Limits
    - Content databases: 300 per web app
    - Application pools: 10 per web server
    - Managed paths: 20 per web app
    - Content database size: 200 GB per content DB
    - Size of one site collection: 100 GB
    - # of sites in a site collection: 250,000
    - Documents in a library: 30 million, with nesting; depends heavily on type, usage, and size of documents
    - Items: 30 million; depends heavily on usage of items
    - SPGroups one SPUser can be in: 5000
    - Users in a site collection: 2 million; depends on UI, nesting, containers, and underlying user store
    - AD principals in an SPGroup: 5000
    - SPGroups in a site collection: 10,000
    - Search service instances: 20
    - Indexed items in search: 100 million
    - Crawl log entries: 100 million
    - Search alerts: 1 million per search application
    - Search crawled properties: 1/2 million
    - URL removals in search: 100 removals per operation
    - User profiles: 2 million per service application
    - Social tags: 500 million per social database

    Read the article

  • Big label generator

    - by jamiet
    Sometimes I write blog posts mainly so that I can find stuff when I need it later. This is such a blog post. Of late I have been writing lots of deployment scripts, and I am a fan of putting big labels into deployment scripts (which, these days, reside in SSDT) so one can easily see what's going on as they execute. Here's such an example from my current project, which results in a big banner being displayed when the script is run. In case you care, PM_EDW is the name of one of our databases. I'm almost embarrassed to admit that I spent about half an hour crafting that and a few others for my current project, because a colleague has just alerted me to a website that would have done it for me, and given me lots of options for how to present it too: http://www.patorjk.com/software/taag/#p=testall&f=Banner3&t=PM__EDW Very useful indeed. Nice one! And yes, I'm sure there are a myriad of sites that do the same thing; I'm a latecomer, OK? @Jamiet
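    The post relies on a screenshot and the TAAG site for the actual lettering; as a rough stand-in, here is a small sketch (not from the post, and not FIGlet-style ASCII art) that frames a label in a box and emits it as T-SQL PRINT statements you could paste into a deployment script. The label text and class name are illustrative.

    // Emits a boxed label as T-SQL PRINT statements for a deployment script.
    // This is a plain box, not the large ASCII-art lettering shown in the post.
    public class BigLabel {
        public static void main(String[] args) {
            String label = args.length > 0 ? args[0] : "Deploying PM_EDW";
            String border = "*".repeat(label.length() + 8);

            String[] banner = { border, "*** " + label + " ***", border };
            for (String line : banner) {
                // Double up single quotes so the PRINT statement stays valid T-SQL
                System.out.println("PRINT '" + line.replace("'", "''") + "';");
            }
        }
    }

    Running it with no arguments prints three PRINT lines that render as a framed "Deploying PM_EDW" banner in the SQL output window.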

    Read the article

  • Why is trailing whitespace a big deal?

    - by EpsilonVector
    Trailing whitespace is enough of a problem for programmers that editors like Emacs have special functions that highlight it or get rid of it automatically, and many coding standards require you to eliminate all instances of it. I'm not entirely sure why, though. I can think of one practical reason for avoiding unnecessary whitespace: if people are not careful about avoiding it, they might change it in between commits, and then we get diffs polluted with seemingly unchanged lines, just because someone removed or added a space. This already sounds like a pretty good reason to avoid it, but I do want to see if there's more to it than that. So, why is trailing whitespace such a big deal?
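    For readers who want to see what the editors and hooks mentioned above actually check for, here is a minimal sketch (not from the question) that reports lines ending in spaces or tabs; the file path and class name are illustrative.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    // Reports lines that end in spaces or tabs, the kind of check an editor
    // highlight or a pre-commit hook performs before whitespace pollutes a diff.
    public class TrailingWhitespaceCheck {
        public static void main(String[] args) throws IOException {
            String path = args.length > 0 ? args[0] : "Example.java"; // illustrative default
            List<String> lines = Files.readAllLines(Paths.get(path));
            for (int i = 0; i < lines.size(); i++) {
                if (lines.get(i).matches(".*[ \\t]+$")) {
                    System.out.println(path + ":" + (i + 1) + ": trailing whitespace");
                }
            }
        }
    }

    A hook would typically reject the commit when any such line is found, which is exactly what keeps those noise-only lines out of diffs.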

    Read the article
