Search Results

Search found 6723 results on 269 pages for 'django models'.


  • Entity Framework: need an easy-going, clean database migration solution

    - by user469652
    I'm using Entity Framework model-first development, and I need to migrate the database often. The EF Database Generation Power Pack doesn't help much, because its data migration has never worked for me. By database migration I mean: change the model, then update the existing database from the model rather than creating a new one. Is there a free tool for this yet, or is this planned as a feature of the next EF release? PS: I love Django's ORM.
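
    Update: Entity Framework has since gained first-class migrations. EF 4.3 shipped Code First Migrations, driven from the Package Manager Console with Enable-Migrations, Add-Migration and Update-Database. Note that it targets code-first rather than model-first development, so a model-first workflow still needs a schema-compare approach.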

    Read the article

  • Is there a simple PHP development server?

    - by pinchyfingers
    When writing web apps in Python, it's brain-dead easy to run a development server: Django and Google App Engine both ship with simple servers. The main feature I'm looking for is no configuration. I want something like the GAE dev server, where you just pass the directory of the app as a parameter when the server is started. Is there a reason this is more difficult with PHP?
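
    Update: PHP 5.4 and later ship exactly this, a zero-configuration development server built into the CLI. Assuming PHP 5.4+, running

        php -S localhost:8000 -t /path/to/app

    serves the given directory with no setup at all, GAE-style.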

    Read the article

  • Speed up PostgreSQL createdb?

    - by John
    Is there a way to speed up PostgreSQL's createdb command? Normally I wouldn't care, but unit testing in Django creates a database every time, and it takes about 5 seconds. I'm using openSUSE 11.2 64-bit with PostgreSQL 8.4.2.
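
    Update: two mitigations worth trying, assuming a dedicated, throwaway test cluster (both trade crash-safety for speed, so they are strictly test-only): put the cluster's data directory on tmpfs, or relax PostgreSQL's durability settings:

        # postgresql.conf -- throwaway TEST cluster only; these settings
        # sacrifice crash-safety for speed and must never reach production
        fsync = off
        synchronous_commit = off
        full_page_writes = off

    Neither changes what createdb itself does, but both typically make the copy-from-template and the schema creation that follows it much faster.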

    Read the article

  • How can I check for unused imports in many Python files?

    - by Thierry Lam
    I remember that when I was developing in C++ or Java, the compiler usually complained about unused methods, functions or imports. In my Django project I have a bunch of Python files which have gone through a number of iterations. Some of those files have a few lines of import statements at the top, and some of those imports are no longer used. Is there a way to locate those unused imports besides eyeballing each one in each file? All my imports are explicit; I don't usually write from blah import *.
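
    Update: static checkers handle this well. pyflakes reports unused imports out of the box (for example: pyflakes $(find . -name '*.py')), and pylint flags them as warning W0611. If you would rather see the mechanics, below is a minimal sketch using the standard-library ast module; it is a heuristic only (it misses dynamic uses such as getattr, __all__ re-exports and doctests), so treat hits as candidates, not verdicts:

        import ast
        import sys

        def unused_imports(path):
            """Return (name, line) pairs for imports never referenced by name."""
            tree = ast.parse(open(path).read(), filename=path)
            imported = {}  # bound name -> line where it was imported
            for node in ast.walk(tree):
                if isinstance(node, ast.Import):
                    for alias in node.names:
                        # 'import a.b' binds the top-level name 'a'
                        name = (alias.asname or alias.name).split('.')[0]
                        imported[name] = node.lineno
                elif isinstance(node, ast.ImportFrom):
                    for alias in node.names:
                        if alias.name != '*':
                            imported[alias.asname or alias.name] = node.lineno
            used = set(node.id for node in ast.walk(tree)
                       if isinstance(node, ast.Name))
            return sorted((name, line) for name, line in imported.items()
                          if name not in used)

        if __name__ == '__main__':
            for path in sys.argv[1:]:
                for name, line in unused_imports(path):
                    print('%s:%d: unused import %r' % (path, line, name))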

    Read the article

  • Python: check value not in unicode list

    - by John
    Hi, I have a list and a value and want to check if the value is not in the list:

        list = [u'first record', u'second record']
        value = 'first record'
        if value not in list:
            # do something

    However, this is not working, and I think it has something to do with the list values having a u at the start. How can I fix this? And before someone suggests it: the list is returned from a Django queryset, so I can't just take the u out of the code :) Thanks
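
    Update: a short Python 2 sketch of what is actually going on. For ASCII text, str and unicode compare equal, so the membership test above already works; the u prefix is a red herring. The prefix only matters when the bytes are not plain ASCII, in which case decode before comparing (utf-8 below is an assumption; use whatever encoding your data really uses):

        records = [u'first record', u'second record']
        value = 'first record'
        print(value not in records)   # False -- ASCII str == unicode, so it IS found

        # If the value may contain non-ASCII bytes, decode it to unicode first:
        raw = 'caf\xc3\xa9'           # the UTF-8 bytes of u'caf\xe9'
        if raw.decode('utf-8') not in records:
            print('do something')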

    Read the article

  • Oracle Communications Data Model

    - by jean-pierre.dijcks
    I've mentioned OCDM in previous posts, but found the podcast linked at the end of this post and figured it is worthwhile to spread the news some more. The Oracle Retail Data Model (ORDM) and the Oracle Communications Data Model (OCDM) are the two data models currently available from Oracle. Both are intended to capture:

    - Business best practices and industry knowledge
    - Pre-built advanced analytics intended to predict future events before they happen (like a churn model)
    - Oracle technology best practices to ensure optimal performance of the model

    All of this typically comes with a reduced time to implementation, or, as the marketing slogan goes, reduced time to value. Here are the links:

    - Podcast on OCDM
    - OTN pages for OCDM and ORDM

    Read the article

  • How do I generate mipmap .png and model .obj files for LibGDX?

    - by John Murdoch
    I'm playing a bit with LibGDX (an OpenGL ES 2.0 wrapper for Android). I found some sample code which uses prepared files to load models and mipmapped textures for them; e.g., https://github.com/libgdx/libgdx/blob/master/demos/invaders/gdx-invaders/src/com/badlogic/gdxinvaders/RendererGL20.java reads an .obj file for the model and an RGB565-format .png file to apply a mipmapped texture to it. What is the best / easiest way for me to create these files? I understand .obj files are generated by a bunch of tools (I installed Blender, Wings3D and Kerkythea so far), but which ones will be the most user-friendly for someone unfamiliar with 3D modelling? More importantly, how do I produce a .png file with the mipmapped texture? The .png I looked at (https://github.com/libgdx/libgdx/blob/master/demos/invaders/gdx-invaders/data/ship.png) seems to include different textures for each of the faces, probably created with some tool. I browsed through the menus of the three tools I have installed but didn't find an option to export such a .png file. What am I missing?
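
    Update, a hedged note: two separate concepts are tangled here. A .png that shows different patches for different faces is a UV-unwrapped texture atlas, created in the modelling tool (in Blender: UV-unwrap the mesh, then paint or bake onto the unwrapped layout and export as .png). Mipmaps, by contrast, are the chain of pre-filtered, progressively smaller copies of that same texture, and they are normally generated at load time rather than stored in the .png; if memory serves, libGDX's Texture class has constructors that take a useMipMaps boolean for exactly this.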

    Read the article

  • Oracle Open World 2012: SQL Developer Recap

    - by thatjeffsmith
    Last week was the ‘big show’ in San Francisco. I was very happy to meet many of you in person, and many of you had questions – lots of questions! We had full or overflowing rooms for our sessions and hands-on labs, and the SQL Developer ‘booths’ were also slammed several times. It's so exciting to see so many of YOU excited about SQL Developer, and very cool to hear the stories of our tools saving you and your organizations so much time (and money!). Instead of doing a Day 0 – Day 9 recap, I thought I'd share with you the questions that I heard more than once. And just for giggles, I'll throw in some answers as well. So, in no particular order...

    What's the difference between Oracle SQL Developer and Oracle SQL Developer Data Modeler?

    Mathematically speaking – two words. But as far as the actual modeling features go, there's no difference between the two applications: the same ‘code’, or features as they pertain to data modeling and design, is in both tools. However, in SQL Developer you have all of the OTHER features fighting for real estate in the UI. So I have a general rule of thumb: if you spend MOST of your time in the database, use SQL Developer; if you spend most of your time in the data model, run the separate and dedicated program, Oracle SQL Developer Data Modeler. When the Data Modeler runs INSIDE of SQL Developer, the Modeler menu items fold under the File menu; the standalone modeler is easier to navigate and manipulate your models with, there's just no worksheet to run your ad-hoc queries, etc. Don't forget you can disable the Data Modeler inside of SQL Developer via the Extensions preference page.

    How can I model my table partitions?

    Partitioning is defined via the physical model, so after you have finished your relational model, you need to generate a physical model. Open the properties for your physical model table and enable the ‘Partitioned’ property. Once you do so, the ‘Partitioning’ page will activate, with lots and lots of partitioning support and options.

    But what about interval partitioning?

    An extension of range partitioning in 11gR2; we don't currently support this partitioning scheme in SQL Developer. But we're working on it!

    Can SQL Developer ignore column order when comparing models?

    Yes! After you start a model compare, one of your options is to disregard the order of an attribute or column definition. Tell SQL Developer you don't care where your column shows up, just as long as it DOES show up.

    Wow, you got a lot of questions around modeling! Is that normal?

    Yes! While we appreciate that many folks inherit their applications and associated designs, new applications are being ‘born’ every day. Since both of our tools are free for anyone to use to design their new Oracle applications, we attract a fair amount of attention.

    I want to do a Hands On Lab. How do I get your software and instructional guides?

    Go here. Download VirtualBox, then download the VB image. Import the appliance and start it. Connect oracle/oracle on the OEL VM, click ‘Start Here’ on the desktop, and follow the instructions. If you need help, ask away!

    You went too fast in your Tips & Tricks session. Do you have cliff notes?

    Yes! And you're SO close to finding them! Just go to my SQL Developer resources page. All of my tips are documented on this blog somewhere; I've indexed the most popular ones on the resource page, and you can use the search dialog on the right to find the rest. Or just send me a comment or question, and I'll do my best to answer them as they come in.

    Read the article

  • Problem converting FBX file into XNB

    - by Dado
    I created a MonoGame content project to convert assets into XNB. For an FBX file without a texture there is no problem: the file is correctly converted, and when I load the XNB into my project everything is OK. The problem occurs when the FBX file has an associated texture map: in this case both the FBX and PNG files are converted to XNB, but when I try to load these XNB files into my project I get the following error: "ContentLoadException: Could not load Models/maze1 asset as a non-content file!" Note: maze1 is the XNB file that was converted from the FBX. How can I solve this problem? Thank you in advance.

    Read the article

  • CakePHP Missing Database Table Error

    - by BRADINO
    I am baking a new project management application at work and added a couple of new tables to the database today. When I went into the console to bake the new models, they were not in the list:

        php /path/cake/console/cake.php bake all -app /path/app/

    So I manually typed in the model name, and I got a "missing database table for model" error. I checked and double-checked, and the database table was named properly. It turns out that some files inside the /app/tmp/cache/ folder were causing Cake not to recognize that I had added new tables to my database. Once I deleted the cache files, Cake instantly recognized my new database tables and I was baking away!

        rm -Rf /path/app/tmp/cache/cake*

    Read the article

  • Problems with graphics on a Sony Vaio Z

    - by dpcat237
    Hello, I have a problem with my Sony Vaio Z VPCZ1. It has a physical GPU selector switch which the Linux kernel does not detect, so after GRUB I see a black display (I have tried different distributions of Ubuntu and other Linux OSes). I read that Ubuntu 10.10 solved a similar problem with hybrid graphics, but not in my case. I found solutions (not easy to apply) for older models, but I'm no expert in Linux and would prefer to ask people with more experience first. Can somebody help me? Has anyone installed Ubuntu on the same laptop? PS: for more information, I found this page: http://goo.gl/ktvq Thanks, Regards

    Read the article

  • Extending NerdDinner: Adding Geolocated Flair

    - by Jon Galloway
    NerdDinner is a website with the audacious goal of "Organizing the world's nerds and helping them eat in packs." Because nerds aren't likely to socialize with others unless a website tells them to do it. Scott Hanselman showed off a lot of the cool features we've added to NerdDinner lately during his popular talk at MIX10, Beyond File | New Company: From Cheesy Sample to Social Platform. Did you miss it? Go ahead and watch it, I'll wait.

    One of the features we wanted to add was flair. You know about flair, right? It's a way to let folks who like your site show it off on their own site. For example, StackOverflow offers flair that shows off your profile and reputation. Great! So how could we add some of this flair stuff to NerdDinner?

    What do we want to show?

    If we're going to encourage our users to give up a bit of their beautiful website to show off a bit of ours, we need to think about what they'll want to show. For instance, my StackOverflow flair is all about me, not StackOverflow. So how will this apply to NerdDinner? Since NerdDinner is all about organizing local dinners, for the flair to be useful it needs to make sense for the person viewing the web page. If someone from Egypt visits my blog, they should see information about NerdDinners in Egypt. That's geolocation – localizing site content based on where the browser's sitting – and it makes sense for flair as well as for entire websites. So we'll set up a simple little callout that prompts viewers to host a dinner in their area. Hopefully our flair works and there is a dinner near your viewers, in which case they'll see another view which lists upcoming dinners near them.

    The Geolocation Part

    Generally, website geolocation is done by mapping the requestor's IP address to a geographic area. It's not an exact science, but I've always found it to be pretty accurate. There are (at least) three ways to handle it:

    1. You pay somebody like MaxMind for a database (with regular updates) that sits on your server, and you use their API to do lookups. I used this on a pretty big project a few years ago and it worked well.
    2. You use the HTML5 Geolocation API or Google Gears or some other browser-based solution. I think those are cool (I use Google Gears a lot), but they're both in flux right now and I don't think either has a wide enough install base yet to rely on. You might want to, but I've heard you do all kinds of crazy stuff, and sometimes it gets you in trouble. I don't mean you talk out of line, but we all laugh behind your back a bit. But, hey, it's up to you. It's your flair or whatever.
    3. There are some free webservices out there that will take an IP address and give you location information. Easy, and works for everyone. That's what we're doing.

    I looked at a few different services and settled on IPInfoDB. It's free, has a great API, and even returns JSON, which is handy for Javascript use. The IP query is pretty simple.
    We hit a URL like this:

        http://ipinfodb.com/ip_query.php?ip=74.125.45.100&timezone=false

    ...and we get an XML response back like this:

        <?xml version="1.0" encoding="UTF-8"?>
        <Response>
            <Ip>74.125.45.100</Ip>
            <Status>OK</Status>
            <CountryCode>US</CountryCode>
            <CountryName>United States</CountryName>
            <RegionCode>06</RegionCode>
            <RegionName>California</RegionName>
            <City>Mountain View</City>
            <ZipPostalCode>94043</ZipPostalCode>
            <Latitude>37.4192</Latitude>
            <Longitude>-122.057</Longitude>
        </Response>

    So we'll build some data transfer classes to hold the location information, like this:

        public class LocationInfo
        {
            public string Country { get; set; }
            public string RegionName { get; set; }
            public string City { get; set; }
            public string ZipPostalCode { get; set; }
            public LatLong Position { get; set; }
        }

        public class LatLong
        {
            public float Lat { get; set; }
            public float Long { get; set; }
        }

    And now hitting the service is pretty simple:

        public static LocationInfo HostIpToPlaceName(string ip)
        {
            string url = "http://ipinfodb.com/ip_query.php?ip={0}&timezone=false";
            url = String.Format(url, ip);
            var result = XDocument.Load(url);
            var location = (from x in result.Descendants("Response")
                            select new LocationInfo
                            {
                                City = (string)x.Element("City"),
                                RegionName = (string)x.Element("RegionName"),
                                Country = (string)x.Element("CountryName"),
                                ZipPostalCode = (string)x.Element("ZipPostalCode"),
                                Position = new LatLong
                                {
                                    Lat = (float)x.Element("Latitude"),
                                    Long = (float)x.Element("Longitude")
                                }
                            }).First();
            return location;
        }

    Getting The User's IP

    Okay, but first we need the end user's IP, and you'd think it would be as simple as reading the value from HttpContext:

        HttpContext.Current.Request.UserHostAddress

    But you'd be wrong. Sorry. UserHostAddress just wraps HttpContext.Current.Request.ServerVariables["REMOTE_ADDR"], but that doesn't get you the IP for users behind a proxy. That's in another header, "HTTP_X_FORWARDED_FOR". So you can either hit a wrapper and then check a header, or just check two headers. I went for uniformity:

        string SourceIP = string.IsNullOrEmpty(Request.ServerVariables["HTTP_X_FORWARDED_FOR"])
            ? Request.ServerVariables["REMOTE_ADDR"]
            : Request.ServerVariables["HTTP_X_FORWARDED_FOR"];

    We're almost set to wrap this up, but first let's talk about our views. Yes, views, because we'll have two.

    Selecting the View

    We wanted to make it easy for people to include the flair in their sites, so we looked around at how other people were doing this. The StackOverflow folks have a pretty good flair system which allows you to include the flair in your site as either an IFRAME reference or a Javascript include. We'll do both. We have a ServicesController to handle use of the site information outside of NerdDinner.com, so this fits in pretty well there. We'll be displaying the same information for both HTML and Javascript flair, so we can use one Flair controller action which will return a different view depending on the requested format. Here's the general flow for our controller action:

    1. Get the user's IP.
    2. Translate it to a location.
    3. Grab the top three upcoming dinners that are near that location.
    4. Select the view based on the format (defaulted to "html").
    5. Return a FlairViewModel which contains the list of dinners and the location information.
        public ActionResult Flair(string format = "html")
        {
            string SourceIP = string.IsNullOrEmpty(
                Request.ServerVariables["HTTP_X_FORWARDED_FOR"])
                ? Request.ServerVariables["REMOTE_ADDR"]
                : Request.ServerVariables["HTTP_X_FORWARDED_FOR"];

            var location = GeolocationService.HostIpToPlaceName(SourceIP);
            var dinners = dinnerRepository
                .FindByLocation(location.Position.Lat, location.Position.Long)
                .OrderByDescending(p => p.EventDate)
                .Take(3);

            // Select the view we'll return.
            // Using a switch because we'll add in JSON and other formats later.
            string view;
            switch (format.ToLower())
            {
                case "javascript":
                    view = "JavascriptFlair";
                    break;
                default:
                    view = "Flair";
                    break;
            }

            return View(view,
                new FlairViewModel
                {
                    Dinners = dinners.ToList(),
                    LocationName = string.IsNullOrEmpty(location.City)
                        ? "you"
                        : String.Format("{0}, {1}", location.City, location.RegionName)
                });
        }

    Note: I'm not in love with the logic here, but it seems like overkill to extract the switch statement away when we'll probably just have two or three views. What do you think?

    The HTML View

    The HTML version of the view is pretty simple. The only thing of any real interest here is the use of an extension method to truncate strings that would cause the titles to wrap:

        public static string Truncate(this string s, int maxLength)
        {
            if (string.IsNullOrEmpty(s) || maxLength <= 0)
                return string.Empty;
            else if (s.Length > maxLength)
                return s.Substring(0, maxLength) + "...";
            else
                return s;
        }

    So here's how the HTML view ends up looking:

        <%@ Page Title="" Language="C#" Inherits="System.Web.Mvc.ViewPage<FlairViewModel>" %>
        <%@ Import Namespace="NerdDinner.Helpers" %>
        <%@ Import Namespace="NerdDinner.Models" %>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head>
            <title>Nerd Dinner</title>
            <link href="/Content/Flair.css" rel="stylesheet" type="text/css" />
        </head>
        <body>
            <div id="nd-wrapper">
                <h2 id="nd-header">NerdDinner.com</h2>
                <div id="nd-outer">
                <% if (Model.Dinners.Count == 0) { %>
                    <div id="nd-bummer">
                        Looks like there's no Nerd Dinners near <%: Model.LocationName %> in the near future.
                        Why not <a target="_blank" href="http://www.nerddinner.com/Dinners/Create">host one</a>?</div>
                <% } else { %>
                    <h3>Dinners Near You</h3>
                    <ul>
                    <% foreach (var item in Model.Dinners) { %>
                        <li>
                            <%: Html.ActionLink(String.Format("{0} with {1} on {2}",
                                item.Title.Truncate(20), item.HostedBy, item.EventDate.ToShortDateString()),
                                "Details", "Dinners", new { id = item.DinnerID }, new { target = "_blank" }) %></li>
                    <% } %>
                    </ul>
                <% } %>
                    <div id="nd-footer">
                        More dinners and fun at <a target="_blank" href="http://nrddnr.com">http://nrddnr.com</a></div>
                </div>
            </div>
        </body>
        </html>

    You'd include this in a page using an IFRAME, like this:

        <IFRAME height=230 marginHeight=0 src="http://nerddinner.com/services/flair" frameBorder=0 width=160 marginWidth=0 scrolling=no></IFRAME>

    The Javascript view

    The Javascript flair is written so you can include it in a webpage with a simple script include, like this:

        <script type="text/javascript" src="http://nerddinner.com/services/flair?format=javascript"></script>

    The goal of this view is very similar to the HTML embed view, with a few exceptions:

    - We're creating a script element and adding it to the head of the document, which will then document.write out the content. Note that you have to consider whether your users will actually have a <head> element in their documents, but for website flair use cases I think that's a safe bet.
    - Since the content is being added to the existing page rather than shown in an IFRAME, all links need to be absolute.
      That means we can't use Html.ActionLink, since it generates relative routes.
    - We need to escape everything, since it's being written out as strings.
    - We need to set the content type to application/x-javascript. The easiest way to do that is to use the <%@ Page ContentType %> directive.

        <%@ Page Language="C#" Inherits="System.Web.Mvc.ViewPage<NerdDinner.Models.FlairViewModel>" ContentType="application/x-javascript" %>
        <%@ Import Namespace="NerdDinner.Helpers" %>
        <%@ Import Namespace="NerdDinner.Models" %>
        document.write('<script>var link = document.createElement(\"link\");link.href = \"http://nerddinner.com/content/Flair.css\";link.rel = \"stylesheet\";link.type = \"text/css\";var head = document.getElementsByTagName(\"head\")[0];head.appendChild(link);</script>');
        document.write('<div id=\"nd-wrapper\"><h2 id=\"nd-header\">NerdDinner.com</h2><div id=\"nd-outer\">');
        <% if (Model.Dinners.Count == 0) { %>
        document.write('<div id=\"nd-bummer\">Looks like there\'s no Nerd Dinners near <%: Model.LocationName %> in the near future. Why not <a target=\"_blank\" href=\"http://www.nerddinner.com/Dinners/Create\">host one</a>?</div>');
        <% } else { %>
        document.write('<h3>Dinners Near You</h3><ul>');
        <% foreach (var item in Model.Dinners) { %>
        document.write('<li><a target=\"_blank\" href=\"http://nrddnr.com/<%: item.DinnerID %>\"><%: item.Title.Truncate(20) %> with <%: item.HostedBy %> on <%: item.EventDate.ToShortDateString() %></a></li>');
        <% } %>
        document.write('</ul>');
        <% } %>
        document.write('<div id=\"nd-footer\"> More dinners and fun at <a target=\"_blank\" href=\"http://nrddnr.com\">http://nrddnr.com</a></div></div></div>');

    Getting IPs for Testing

    There are a variety of online services that will translate a location to an IP, which were handy for testing these out. I found http://www.itouchmap.com/latlong.html to be the most useful, but I'm open to suggestions if you know of something better.

    Next steps

    I think the next step here is to minimize load – you know, in case people start actually using this flair. There are two places to think about: the NerdDinner.com servers, and the services we're using for geolocation. I usually think about caching as a first attack on server load, but that's less helpful here since every user will have a different IP. Instead, I'd look at taking advantage of Asynchronous Controller Actions, a cool new feature in ASP.NET MVC 2. Async actions let you call a potentially long-running webservice without tying up a thread on the server while waiting for the response. There's some good info on that in the MSDN documentation, and Dino Esposito wrote a great article on Asynchronous ASP.NET Pages in the April 2010 issue of MSDN Magazine.

    But let's think of the children, shall we? What about ipinfodb.com? Well, they don't have specific daily limits, but they do throttle you if you put a lot of traffic on them. From their FAQ:

        We do not have a specific daily limit but queries that are at a rate faster than 2 per second will be put in "queue". If you stay below 2 queries/second everything will be normal. If you go over the limit, you will still get an answer for all queries but they will be slowed down to about 1 per second. This should not affect most users but for high volume websites, you can either use our IP database on your server or we can whitelist your IP for 5$/month (simply use the donate form and leave a comment with your server IP).
        Good programming practices such as not querying our API for all page views (you can store the data in a cookie or a database) will also help not reaching the limit.

    So the first step there is to save the geolocation information in a time-limited cookie, which will allow us to look up the local dinners immediately without having to hit the geolocation service.
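
    One caveat worth adding to the IP-sniffing code above: HTTP_X_FORWARDED_FOR can contain a comma-separated chain of addresses (client, proxy1, proxy2, ...) rather than a single IP, so a more defensive version would split on commas and take the first entry. And since the header is supplied by the client, it can be spoofed; that's fine for picking which dinners to show, but don't trust it for anything security-sensitive.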

    Read the article

  • Introducing Microsoft SQL Server 2008 R2 - Business Intelligence Samples

    - by smisner
    On April 14, 2010, Microsoft Press (blog | twitter) released my latest book, co-authored with Ross Mistry (twitter), as a free ebook download: Introducing Microsoft SQL Server 2008 R2. As the title implies, this ebook is an introduction to the latest SQL Server release. Although you'll find a comprehensive review of the product's features in this book, you will not find the step-by-step details that are typical in my other books. For those readers who are interested in a more interactive learning experience, I have created two sample files for download: the IntroSQLServer2008R2Samples project and the Sales Analysis workbook.

    Here's a recap of the business intelligence chapters and the samples I used to generate the screenshots, by chapter:

    Chapter 6: Scalable Data Warehousing covers a new edition of SQL Server, Parallel Data Warehouse. Understandably, Microsoft did not ship me the software and hardware to set up my own Parallel Data Warehouse environment for testing purposes, and consequently you won't see any screenshots in this chapter. I received a lot of information and a lot of help from the product team during the development of this chapter to ensure its technical accuracy.

    Chapter 7: Master Data Services is a new component in SQL Server. After you install Master Data Services (MDS), which is a separate installation from SQL Server although it's found on the same media, you can install sample models to explore (which is what I did to create screenshots for the book). To do this, you deploy the packages found at \Program Files\Microsoft SQL Server\Master Data Services\Samples\Packages. You will first need to use the Configuration Manager (in the Microsoft SQL Server 2008 R2\Master Data Services program group) to create a database and a Web application for MDS. Then, when you launch the application, you'll see a Getting Started page with a Deploy Sample Data link that you can use to deploy any of the sample packages.

    Chapter 8: Complex Event Processing is an introduction to another new component, StreamInsight. This topic was far too large to cover in depth in a single chapter, so I focused on information such as architecture, development models, and an overview of the key sections of code you'll need to develop for your own applications. StreamInsight is an engine that operates on data in flight, and as such has no user interface that I could include in the book as screenshots. The November CTP version of SQL Server 2008 R2 included code samples as part of the installation, but these are not the official samples that will eventually be available on CodePlex. At the time of this writing, the samples are not yet published.

    Chapter 9: Reporting Services Enhancements provides an overview of all the changes to Reporting Services in SQL Server 2008 R2, and there are many! In previous posts, I shared more details than you'll find in the book about the new functions (Lookup, MultiLookup, and LookupSet), properties for page numbering, and the new global variable RenderFormat. I will confess that I didn't use actual data in the book for my discussion of the Lookup functions, but I did create real reports for the blog posts and will upload those separately. For the other screenshots and examples in the book, I have created the IntroSQLServer2008R2Samples project for you to download. To preview these reports in Business Intelligence Development Studio, you must have the AdventureWorksDW2008R2 database installed, and you must download and install SQL Server 2008 R2. For the map report, you must execute the PopulationData.sql script that I included in the samples file to add a table to the AdventureWorksDW2008R2 database. The IntroSQLServer2008R2Samples project includes the following files:

    - 01_AggregateOfAggregates.rdl, to illustrate the use of embedded aggregate functions
    - 02_RenderFormatAndPaging.rdl, to illustrate the use of page break properties (Disabled, ResetPageNumber), the PageName property, and the RenderFormat global variable
    - 03_DataSynchronization.rdl, to illustrate the use of the DomainScope property
    - 04_TextboxOrientation.rdl, to illustrate the use of the WritingMode property
    - 05_DataBar.rdl
    - 06_Sparklines.rdl
    - 07_Indicators.rdl
    - 08_Map.rdl, to illustrate a simple analytical map that uses color to show population counts by state
    - PopulationData.sql, to provide the data necessary for the map report

    Chapter 10: Self-Service Analysis with PowerPivot introduces two new components to the Microsoft BI stack, PowerPivot for Excel and PowerPivot for SharePoint, which you can learn more about at the PowerPivot site. To produce the screenshots for this chapter, I created the Sales Analysis workbook, which you can download (although you must have Excel 2010 and the PowerPivot for Excel add-in installed to explore it fully). It's a rather simple workbook, because space in the book did not permit a complete exploration of all the wonderful things you can do with PowerPivot. I used a tutorial that was available with the CTP version as a basis for the report, so it might look familiar if you've already started learning about PowerPivot.

    In future posts, I'll continue exploring the new features in greater detail. If you have any special requests, please let me know!

    Read the article

  • Big Data – Role of Cloud Computing in Big Data – Day 11 of 21

    - by Pinal Dave
    In yesterday's blog post we learned the importance of NewSQL. In this article we will understand the role of the cloud in the Big Data story.

    What is Cloud?

    Cloud has been the biggest buzzword of the last few years. Everyone knows about the cloud, and it is extremely well defined online; in this article we will discuss the cloud in the context of Big Data. Cloud computing is a method of providing shared computing resources to applications which require dynamic resources. These resources include applications, computing, storage, networking, development and various deployment platforms. The fundamental idea of cloud computing is that it shares pretty much all the resources and delivers them to end users as a service. Examples of cloud computing and Big Data together are Google and Amazon.com; both have fantastic Big Data offerings with the help of the cloud, which we will discuss later in this post.

    There are two main cloud deployment models: 1) the public cloud and 2) the private cloud.

    Public Cloud: cloud infrastructure built by commercial providers (Amazon, Rackspace etc.) who create a highly scalable data center that hides the complex infrastructure from the consumer and provides various services.

    Private Cloud: cloud infrastructure built by a single organization which manages a highly scalable data center internally.

    Here is a quick comparison between public cloud and private cloud, from Wikipedia:

                        Public Cloud                        Private Cloud
        Initial cost    Typically zero                      Typically high
        Running cost    Unpredictable                       Unpredictable
        Customization   Impossible                          Possible
        Privacy         No (host has access to the data)    Yes
        Single sign-on  Impossible                          Possible
        Scaling up      Easy, within defined limits         Laborious, but no limits

    Hybrid Cloud: cloud infrastructure composed of two or more clouds, such as a public and a private cloud. A hybrid cloud gives the best of both worlds, as it combines multiple cloud deployment models together.

    Cloud and Big Data – Common Characteristics

    Many characteristics of cloud architecture and cloud computing are essentially important for Big Data as well. They overlap heavily, and in many places it just makes sense to use the power of both architectures and build a highly scalable framework. Here is the list of characteristics of cloud computing important to Big Data:

    - Scalability
    - Elasticity
    - Ad-hoc resource pooling
    - Low cost to set up infrastructure
    - Pay on use, or pay as you go
    - High availability

    Leading Big Data Cloud Providers

    There are many players in the Big Data cloud, but we will list a few of the better-known ones.

    Amazon: Amazon is arguably the most popular Infrastructure as a Service (IaaS) provider. The history of how Amazon started in this business is very interesting. They started out with a massive infrastructure to support their own business, then gradually figured out that their own resources were underutilized most of the time. They decided to get the maximum out of the resources they had, and hence launched the Amazon Elastic Compute Cloud (Amazon EC2) service in 2006. Their products have evolved a lot recently, and it is now one of their primary businesses besides retail selling. Amazon also offers Big Data services under Amazon Web Services. Here is the list of the included services:

    - Amazon Elastic MapReduce – processes very high volumes of data
    - Amazon DynamoDB – a fully managed NoSQL (Not Only SQL) database service
    - Amazon Simple Storage Service (S3) – a web-scale service designed to store and accommodate any amount of data
    - Amazon High Performance Computing – provides tuned, low-latency high performance computing clusters
    - Amazon Redshift – a petabyte-scale data warehousing service

    Google: though Google is known for its search engine, we all know it is much more than that.

    - Google Compute Engine – offers secure, flexible computing from energy-efficient data centers
    - Google BigQuery – allows SQL-like queries to run against large datasets
    - Google Prediction API – a cloud-based machine learning tool

    Other Players: besides Amazon and Google, there are other players in the Big Data market as well. Microsoft is attempting Big Data with the cloud through Microsoft Azure. Additionally, Rackspace and NASA together have initiated OpenStack; the goal of OpenStack is to provide a massively scaled, multi-tenant cloud that can run on any hardware.

    Things to Watch

    Cloud-based solutions integrate very well with the Big Data story and are very economical to implement. However, there are a few things one should watch carefully when deploying Big Data on the cloud:

    - Data integrity
    - Initial cost
    - Recurring cost
    - Performance
    - Data access security
    - Location
    - Compliance

    Every company has a different approach to Big Data and different rules and regulations. Based on various factors, one can implement a custom Big Data solution on a cloud.

    Tomorrow: in tomorrow's blog post we will discuss the various operational databases supporting Big Data.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • MVC 2 Client Side Model Validation with ExtJS

    One of the most exciting new features in MVC 2 is "Enhanced Model Validation support across both server and client"; this new enhanced support allows for client side validation to be dynamically generated into a view from DataAnnotations attributes on models. One minor complaint: out of the box, it only supports the Microsoft AJAX libraries. Good news: it can use other frameworks, and we have done just that!

    Ext.ux.MvcFormValidator

    The MvcFormValidator is an alternative form validation...

    Read the article

  • Modelling photo-realistic grass in realtime

    - by sebf
    Hello, I see a number of tutorials on how to create good-looking grass for 3D renders, but I can't think how to model it for realtime use in a game's scenery. Sure, simple models with alpha cutouts can be used to create plants and trees in really awesome scenery, but what about a lawn? Are there any good tricks to achieve this effect? I tried with a simple 4-sided box and a small texture, and the number of objects needed for a decent appearance made Max crawl to a halt. (I am thinking it may be possible with a shader, but that is a whole other area, so I thought I would just ask about anyone's experience with modelling it here.) Thanks!
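
    Update, the standard realtime tricks for what they're worth: render grass as camera-facing billboards or crossed alpha-tested quads rather than solid geometry; batch thousands of those quads into a few draw calls with hardware instancing (one shared mesh, per-instance position and rotation); and animate sway in the vertex shader instead of moving objects on the CPU. A lawn rendered this way costs a handful of draw calls, which is why a scene of individual boxes brings Max to a halt while games manage dense grass at speed.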

    Read the article

  • What Agile Model do you use at Work?

    - by Kyle Rozendo
    I am looking to start pushing for more agile processes in the workplace, and to do my best to outlaw cowboy coding as much as possible. I understand many of the different models and am just looking to see which model (or which parts of which models) has the highest uptake, and in what industry it is being used:

    - Extreme Programming (XP)
    - Adaptive Software Development (ASD)
    - Scrum
    - Dynamic Systems Development Method (DSDM)
    - Crystal
    - Feature Driven Development (FDD)
    - Lean Software Development (LSD)
    - Agile Modelling (AM)
    - Agile Unified Process (AUP)
    - Kanban

    If you care to add comments about what you don't like, do like, or have tried that didn't work, that would also be appreciated.

    Read the article

  • Spotlight on Claims: Serving Customers Under Extreme Conditions

    - by [email protected]
    Oracle Insurance's director of marketing for EMEA, John Sinclair, recently attended the CII Spotlight on Claims event in London. Bad weather and its implications for the insurance industry have become very topical as the frequency and diversity of natural disasters - including rain, wind and snow - has surged across Europe this winter. On England's wettest day on record, the county of Cumbria was flooded with 12 inches of rain within 24 hours. Freezing temperatures wreaked havoc on European travel, causing high-speed TGV trains to break down and stranding hundreds of passengers in a tunnel under the English Channel all night long without heat or electricity. A storm named Xynthia thrashed France and surrounding countries with hurricane force, flooding ports and killing 51 people. After the spring equinox, insurers may have thought the worst had passed. Then along came Eyjafjallajökull, spewing out vast quantities of volcanic ash in what is turning out to be one of the most costly natural disasters in history.

    Such extreme events challenge insurance companies' ability to serve their customers just when customers need their help most. When you add economic downturn and competitive pressures to the mix, insurers are stretched further and required to continually learn and innovate to meet high customer expectations with reduced budgets. These and other issues were hot topics of discussion at the recent "Spotlight on Claims" seminar in London, focused on how weather is affecting claims and the insurance industry. The event was organized by the CII (Chartered Insurance Institute), a group with 90,000 members that has been at the forefront of setting professional standards for the insurance industry for over a century. Insurers came to the conference to hear how they could better serve their customers under extreme weather conditions, learn from the experience of their peers, and hear about technological breakthroughs in climate modeling, geographic intelligence and IT.

    Customer case studies at the conference highlighted the importance of effective and constant communication in handling the overflow of catastrophe-related claims. First and foremost is the need to rapidly establish initial communication with claimants to build their confidence in a positive outcome. Ongoing communication then needs to continue throughout the claims cycle to manage expectations and maintain ownership of the process from start to finish. Strong internal communication to support frontline staff was also deemed critical to successful crisis management, as was communication with the broader insurance ecosystem to tap into extended resources and business intelligence. Advances in technology, such as web-based systems to access policies and enter first notice of loss in the field, as well as customer-focused self-service portals and multichannel alerts, are instrumental in improving customer satisfaction and helping insurers deal with the claims surge, which can often reach four or more times normal workloads. Dynamic models of the global climate system can now be used to better understand weather-related risks, and as these models mature it is hoped that they will soon become more accurate in predicting the timing of catastrophic events. Geographic intelligence is also being used within the claims environment to better assess loss reserves and detect fraud.

    Despite these advances in dealing with catastrophes and predicting their occurrence, there will never be a substitute for qualified front-line staff to deal with customers. In light of pressures to streamline efficiency, there was debate as to whether outsourcing was the solution, or whether it was better to build on the people you have. In the final analysis, nearly everybody agreed that in the future insurance companies will have to work better and smarter to stay on top. An appeal was also made for greater collaboration amongst industry participants in dealing with the extreme conditions and systemic stress brought on by natural disasters. It was pointed out that the public often judges the industry as a whole, rather than the individual carriers, when it comes to freakish events, and that all would benefit at such times from the pooling of limited resources and professional skills rather than competing in silos for competitive advantage - especially the end customer.

    One case study that stood out was how The Motorists Insurance Group was able to power through one of the most devastating catastrophes in recent years: Hurricane Ike. The keys to Motorists' success were superior people, processes and technology. They did a lot of upfront planning and invested in their people, creating a healthy team environment that delivered "max service" even while employees were experiencing the same level of devastation as the rest of the population. Processes were rapidly adapted to meet the challenge of the catastrophe and continually adjusted to Ike's specific conditions as they evolved. Technology was fundamental to the execution of their strategy, enabling anywhere access, on-the-fly reassignment of resources, and rapid training to augment the workforce. You can learn more about the Motorists experience by watching this video.

    John Sinclair is marketing director for Oracle Insurance in EMEA. He has more than 20 years of experience in insurance and financial services.

    Read the article

  • What Is the Purpose of the “Do Not Cover This Hole” Hole on Hard Drives?

    - by Jason Fitzpatrick
    From tiny laptop hard drives to beefier desktop models, traditional disk-based hard drives have a very bold warning on them: DO NOT COVER THIS HOLE. What exactly is the hole, and what terrible fate would befall you if you covered it? Today's Question & Answer session comes to us courtesy of SuperUser, a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    Read the article

  • 3DS Max MassFX -- Animation to XNA without bones/skins?

    - by AnKoe
    I made a model in which a number of cobblestones fall into a hole, using the rigid bodies in 3DS Max 2012 MassFX. They are just editable polys: no skin, no bones. I want this animation (Take 001, frames 0-100) to play when the game loads the mesh, but I haven't found a way to get at the animation. Does anyone have suggestions? All the tutorials for animated skinned models don't seem to work with a model set up like this. Do I really need to give each of the 145 rocks a bone? If so, does anyone have a suggestion on how to streamline that, or is there an alternate solution to achieve this effect? The animation only needs to play once when the game starts, and that's it. Thanks.

    Read the article

  • T4Toolbox and Visual Studio 2010

    - by Ben Griswold
    I've been using the T4Toolbox to help generate my ASP.NET MVC models and scaffolding for a while now. Another developer tried using my generator project last week and ran into trouble due to a breaking change around the RenderCore() and TransformText() methods in support of VS 2010. If you upgraded to the latest version of T4Toolbox and received a build error similar to the following, you are probably in the same boat:

        GeneratedTextTransformation.[Template].RenderCore(): no suitable method found to override

    We took the easy way out: I had him uninstall the latest version of T4Toolbox and install version 9.7.25.1, which my templates were initially coded against. For now, that worked great, but it sounds like I'll be doing some rework of the 20+ templates in my project to support Visual Studio 2010 when we migrate later this month.

    Read the article

  • Polygon count target range for MMO being released in 2 years

    - by classer
    What would a realistic poly count target range be for NPC and player models in a 3D MMO that will be released in two years? What about the poly count target range for the entire camera view (environment, NPC and player meshes)? I have read in some places that one should not aim too low if the game will come out in a couple of years, because technology is always advancing. If you can give some mesh poly stats on what current MMOs / MMORPGs are running, plus future projections, that would be great. Thank you.

    Read the article

  • Dual-boot (Windows XP / Ubuntu 12.04) graphics card for an Ubuntu desktop and Windows XP games

    - by iole1
    For work I need to get a new and cheap graphics card for a dual-boot machine: Windows XP / Ubuntu 12.04 LTS. The only requirements I have are:

    - it should work 'flawlessly' in Ubuntu (proprietary drivers are OK)
    - it should handle Guild Wars 2 and League of Legends in Windows XP (this is really the top priority, as we need to be able to play at work :) yes, I have a cool job)

    I know nothing about graphics cards (and it seems to be a jungle out there). From other questions here and some digging around the web, I think I'd like to go for an Nvidia card. I've been trying to figure out which models fit the system requirements, but it seems they use different kinds of model numbers, so I'm none the wiser.

    tl;dr: will http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gt-620-oem/specifications run Guild Wars 2 (http://gamesystemrequirements.com/games.php?id=938)? Or what is the lowest-end card from Nvidia that will run GW2 smoothly and work well in Ubuntu 12.04? Thanks!

    Read the article

  • assigning values to shader parameters in the XNA content pipeline

    - by Nick
    I have tried creating a simple content processor that assigns the custom effect I created to models, instead of the default BasicEffect:

        [ContentProcessor(DisplayName = "Shadow Mapping Model")]
        public class ShadowMappingModelProcessor : ModelProcessor
        {
            protected override MaterialContent ConvertMaterial(MaterialContent material, ContentProcessorContext context)
            {
                EffectMaterialContent shadowMappingMaterial = new EffectMaterialContent();
                shadowMappingMaterial.Effect = new ExternalReference<EffectContent>("Effects/MultipassShadowMapping.fx");
                return context.Convert<MaterialContent, MaterialContent>(shadowMappingMaterial, typeof(MaterialProcessor).Name);
            }
        }

    This works, but when I go to draw a model in the game, the effect has no material properties assigned. How would I go about assigning, say, my DiffuseColor or SpecularColor shader parameter to white, or (better) can I assign it to some value specified by the artist in the model? (I think this may have something to do with the OpaqueDataDictionary, but I am confused about how to use it; the content pipeline has always been a black box to me.)
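
    Update, a hedged pointer rather than a full answer: in the XNA content pipeline, public properties you add to a processor class become processor parameters, which show up in the Properties window for each content item in Visual Studio, so you can expose something like a default DiffuseColor there and copy it onto the effect material inside ConvertMaterial. Values the artist sets in the modeling tool arrive on the incoming MaterialContent, typically via its OpaqueData dictionary, keyed by names the model exporter chose; the exact keys vary by exporter, so inspect material.OpaqueData in a debugger to see what your artist's tool actually exports.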

    Read the article

  • What's the real benefit of meta-modeling?

    - by Jakob
    After reading several texts about meta-modeling, I still don't really get the practical benefit. Sometimes I think it is only an interesting mind game, not a useful tool. Sure, it is wise to clarify your modeling vocabulary: some may say class where others say entity or concept, but that is just documenting your modeling terminology. Meta-modeling, as I understand it, is more complex, as it tries to formalize and abstract modeling itself. Some good examples are Keet's formal comparison of conceptual data modeling languages (UML, ERM and ORM) from academia, and the Meta Object Facility (MOF) from industry. To me, MOF looks as impractical as CORBA, which was also created by the OMG. In theory you could use meta-modeling to transform and integrate models written in different modeling languages, but is anyone actually doing this?

    Read the article
