Search Results

Search found 1508 results on 61 pages for 'deep b'.


  • How to do pixel-by-pixel modeling in Unity3D?

    - by Kabumbus
    So generally I want to have an API like pixels.addPixel3D(new Pixel3D(0xFF0000, 100, 100, 100)); (color, position), where pixels is some abstraction over a 3D scene object; a point cloud, so to speak. It would be of great use in deep-space/star modeling... I want to set each pixel by hand (no image base or anything automatic). The point is to model something like the examples that were linked here (one was a live Flash analogue). How can I do such a thing in Unity?
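    In Unity itself this is usually done by packing the points into a Mesh rendered with point topology (MeshTopology.Points) rather than calling an add-pixel API per frame. As a minimal, language-agnostic sketch of the abstraction the asker describes (written here in Java; the Pixel3D/addPixel3D names are the asker's, the container class is an assumption):

        import java.util.ArrayList;
        import java.util.List;

        // Sketch of the asker's abstraction: a point cloud addressed pixel by pixel.
        class Pixel3D {
            final int color;        // packed 0xRRGGBB, e.g. 0xFF0000 for red
            final float x, y, z;    // position in world space
            Pixel3D(int color, float x, float y, float z) {
                this.color = color; this.x = x; this.y = y; this.z = z;
            }
        }

        class PointCloud {
            private final List<Pixel3D> points = new ArrayList<>();

            // usage: pixels.addPixel3D(new Pixel3D(0xFF0000, 100, 100, 100));
            void addPixel3D(Pixel3D p) { points.add(p); }

            // Pack positions as x,y,z triples, ready to hand to a point-topology mesh.
            float[] packedPositions() {
                float[] out = new float[points.size() * 3];
                for (int i = 0; i < points.size(); i++) {
                    Pixel3D p = points.get(i);
                    out[i * 3] = p.x; out[i * 3 + 1] = p.y; out[i * 3 + 2] = p.z;
                }
                return out;
            }
        }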

    Read the article

  • Filing the XBRL version of Oracle's 2011 10K with the SEC using Oracle Hyperion Disclosure Management

    Oracle Hyperion Disclosure Management is designed to "demystify" the creation of XBRL documents. Featuring deep integration with existing Oracle financial reporting tools, it is the easiest and most straightforward approach to XBRL reporting for Oracle's enterprise performance management and enterprise resource planning customers. In this podcast hear how Oracle itself has improved its SEC XBRL submission process through the implementation of Oracle Hyperion Disclosure Management.

    Read the article

  • IT Job Titles - What Do They Mean?

    Although only a few decades old, the information technology or IT field is as broad and deep as industries that have been around for centuries. IT job categories, titles and specialties abound -- so ... [Author: Allen B. Ury - Computers and Internet - March 27, 2010]

    Read the article

  • Accenture completes acquisition of Enkitec

    - by Javier Puerta
    The acquisition of Enkitec by Accenture has now been completed. “This acquisition infuses a level of industry-leading talent into our Oracle Engineered Systems business,” said Derek Steelberg, global managing director of Oracle business for Accenture. “The combination of our infrastructure transformation expertise and Enkitec’s deep Oracle knowledge and skill set means that our clients can more quickly derive greater business value from their investments across the spectrum of Oracle technologies.” See the full Accenture press release with details here.

    Read the article

  • Quicker Loading of Your Website Ensures Improved SE Rankings

    Now that Google has released news that website speed will be one of the factors in search engine ranking, it is natural that most website owners will want to delve into the matter a bit more deeply. According to the statement, websites with slower loading speeds will have a lower probability of making their way onto the search engine results pages than those that load faster.

    Read the article

  • Highlights from Google I/O 2011

    Google I/O brings together thousands of developers for two days of deep technical content, focused on building the next generation of web, mobile, and enterprise applications with Google and open web technologies such as Android, Google Chrome, Google APIs, Google Web Toolkit, App Engine, and more. (Video from GoogleDevelopers, running time 01:54.)

    Read the article

  • Supermicro motherboards and systems

    - by jchang
    I used to buy SuperMicro exclusively for my own lab. SuperMicro always had a deep lineup of motherboards with almost every conceivable variation; in particular, they had the maximum memory and I/O configurations desired for database servers. But from around 2006, I became too lazy to source the additional components necessary to complete the systems, and switched to Dell PowerEdge tower servers. Now I may reconsider, as neither Dell nor HP is offering the right combination of PCI-E slots. Nor...(read more)

    Read the article

  • Silverlight 4 will launch at DevConnections in Vegas!

    Scott Guthrie will launch Silverlight 4 at the DevConnections/ASP.NET and Silverlight Conference in Las Vegas on April 12-14. Get three days of deep sessions on Silverlight 4, ASP.NET 4.0, AJAX and MVC2, plus the global launch of Visual Studio 2010. All at the same conference!

    Read the article

  • SQL Relay 2012

    This May brings SQL Server experts to your locality, with full-day seminars in Edinburgh, Manchester, Birmingham, Bristol and London: overview sessions from Microsoft in the morning, deep-dive sessions from SQL Server MVPs in the afternoon, and evening community events.

    Read the article

  • How to render only part of an image in LWJGL/OpenGL

    - by Ephyxia
    I'm making a mining/building game in Java using Slick2D, and I want to make it so you can only see a few blocks in any direction while you are underground. The best example I could find of what I want to do is the game Miner Dig Deep. One way I thought of doing it would be to have a large image and draw transparent areas on it where you need to be able to see, but even if that were an efficient method, I wouldn't be sure how to do it.
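    One approach that avoids maintaining a huge mask image is to draw a darkness overlay per tile, with alpha based on distance from the player. A minimal sketch against Slick2D's Graphics API (tile size and visibility radius below are assumed values):

        import org.newdawn.slick.Color;
        import org.newdawn.slick.Graphics;

        public class FogOfWarRenderer {
            private static final int TILE = 32;        // assumed tile size in pixels
            private static final float RADIUS = 4.5f;  // assumed visible range, in tiles

            // Call from the game's render() after drawing the world.
            public static void darken(Graphics g, int playerTileX, int playerTileY,
                                      int tilesWide, int tilesHigh) {
                for (int tx = 0; tx < tilesWide; tx++) {
                    for (int ty = 0; ty < tilesHigh; ty++) {
                        float dist = (float) Math.hypot(tx - playerTileX, ty - playerTileY);
                        // Fully opaque beyond the radius, fading to clear near the player.
                        float alpha = Math.min(1f, Math.max(0f, dist / RADIUS - 0.2f));
                        if (alpha > 0f) {
                            g.setColor(new Color(0f, 0f, 0f, alpha));
                            g.fillRect(tx * TILE, ty * TILE, TILE, TILE);
                        }
                    }
                }
            }
        }

    Because only a rectangle per tile is drawn, there is no large texture to update when the player moves; the overlay simply recomputes each frame.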

    Read the article

  • Running Assemblies in custom application domains

    This sounds like a deep technical topic, and internally it is. But from a developer's perspective, if you need to run an assembly under a fresh new ApplicationDomain, it is as easy as using a few of the classes defined in the System namespace. Below I explain this with a small console application.
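    The post's console application is C# (System.AppDomain). As a rough JVM analogue of the same isolation idea, a hedged sketch and not the article's code, one can load and run a class through its own throwaway ClassLoader (the jar path and class name below are hypothetical):

        import java.net.URL;
        import java.net.URLClassLoader;

        public class IsolatedLoad {
            public static void main(String[] args) throws Exception {
                // Each URLClassLoader acts like a disposable "domain" for the jar's classes.
                URL[] path = { new URL("file:///tmp/plugin.jar") };          // hypothetical jar
                try (URLClassLoader domain = new URLClassLoader(path, null)) {
                    Class<?> entry = domain.loadClass("com.example.Plugin"); // hypothetical class
                    entry.getMethod("run").invoke(entry.getDeclaredConstructor().newInstance());
                }
                // Once the loader is closed and unreferenced, its classes become collectible,
                // loosely mirroring an AppDomain unload in .NET.
            }
        }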

    Read the article

  • Search Engine Optimization (SEO) 101 - Part One

    Search engine optimization is a vital part of doing business online. Without it, your website will just become another statistic, buried deep in Google's search results. However, if you are following proper SEO protocol, your site will rise to the top and become easily discoverable by anyone searching for your keywords.

    Read the article

  • Ubuntu 10.10 Alpha 1 Is Ready for Testing

    Softpedia: "While Ubuntu fans out there still discover and enjoy the brand-new Ubuntu 10.04 LTS (Lucid Lynx) operating system, somewhere deep in the Ubuntu headquarters, the Canonical developers are working on the next major update for their popular Linux distribution."

    Read the article

  • Android real time multiplayer over LAN

    - by Heigo
    I've developed several games for the Android platform and am now planning to create my first multiplayer game. What I have in mind is basically just a two-player game which you can play with two phones over a local area connection/WiFi. Both phones need to be able to pass three integer values to the other phone in real time. So far I have considered using Sockets, but before I dig into it too deep I wanted to ask if there might be a better approach. Thanks!
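    A minimal sketch of the plain-Socket approach (one phone hosts, the other connects; the port number is illustrative, and on Android all of this must run off the UI thread, e.g. in a background Thread):

        import java.io.DataInputStream;
        import java.io.DataOutputStream;
        import java.net.ServerSocket;
        import java.net.Socket;

        public class IntLink {
            // Host side: accept one peer, read its three ints, reply with our own.
            public static void host() throws Exception {
                try (ServerSocket server = new ServerSocket(5005);            // illustrative port
                     Socket peer = server.accept();
                     DataInputStream in = new DataInputStream(peer.getInputStream());
                     DataOutputStream out = new DataOutputStream(peer.getOutputStream())) {
                    int x = in.readInt(), y = in.readInt(), z = in.readInt();
                    out.writeInt(x); out.writeInt(y); out.writeInt(z);        // echo back
                    out.flush();
                }
            }

            // Client side: connect to the host's LAN address and send three ints.
            public static void client(String hostAddress) throws Exception {
                try (Socket peer = new Socket(hostAddress, 5005);
                     DataOutputStream out = new DataOutputStream(peer.getOutputStream())) {
                    out.writeInt(10); out.writeInt(20); out.writeInt(30);
                    out.flush();
                }
            }
        }

    For three ints at game rates over a LAN, TCP sockets like these are usually adequate; UDP is the common alternative when occasional packet loss is acceptable and latency matters more.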

    Read the article

  • Beyond Cloud Technology, Enabling A More Agile and Responsive Organization

    - by sxkumar
    This is the second part of the blog “Clouds, Clouds Everywhere But not a Drop of Rain”. In the first part, I shared how a broad-based transformation makes cloud more than a technology initiative. In this section I will describe how it requires people (organizational) and process changes as well, and why these changes are as critical as the choice of the right tools and technology.

    People: Most IT organizations have a fairly complex organizational structure. There are different groups managing different pieces of the puzzle, and yet they don't always work together. Provisioning a new application may therefore require a request to float endlessly through the worlds of system administrators, DBAs and middleware admins, resulting in long delays and constant finger pointing. Cloud users expect end-to-end automation, which requires these silos to be greatly simplified, if not completely eliminated. Most customers I talk to acknowledge this problem but are quick to admit that such a transformation is hard. As hard as it may be, I am afraid that the status quo is no longer an option. Sticking to an organizational structure that was created ages ago will not only impede cloud adoption, it also risks making IT skills increasingly irrelevant in a world that is rapidly moving towards converged applications and infrastructure.

    Process: Most IT organizations today operate with a mindset that they must fully "control" access to any and all types of IT services. This in turn leads to people clinging to outdated manual approval processes. While requiring approvals for scarce resources makes sense, insisting that every single request be manually approved defeats the very purpose of cloud. Not only does this cause delays, at least partially negating the agility benefits; it also results in gross inefficiency. In a cloud environment, self-service access should be governed by policies and quotas that administrators define up front. For a cloud initiative to be successful, IT organizations MUST be ready to empower users by giving them real control, rather than insisting on brokering every single interaction between users and the cloud resources.

    Technology: From a technology perspective, cloud is about consolidation, standardization and automation. A consolidated and standardized infrastructure helps increase utilization and reduces cost. Additionally, it enables a much higher degree of automation, thereby providing users the required agility while minimizing operational costs. Obviously, automation is the key to cloud. Unfortunately it hasn't received as much attention within enterprises as it should have. Many organizations are just now waking up to the criticality of automation, and it still often gets relegated to the back burner in favor of other "high priority" projects. However, it is important to understand that without the right type and level of automation, cloud will remain a distant dream for most enterprises. This in turn makes the choice of cloud management software extremely critical. To be effective in an enterprise environment, a cloud management solution must meet the following qualifications:

    Broad and Deep Solution: It should offer a broad and deep solution to enable the kind of broad-based transformation we are talking about. Its footprint must cover physical and virtual systems, as well as the infrastructure, database and application tiers. Too many enterprises choose to equate cloud with virtualization. While virtualization is a critical component of a cloud solution, it is just a component and not the whole solution. Similarly, too many people tend to equate cloud with Infrastructure-as-a-Service (IaaS). While it is perfectly reasonable to treat IaaS as a starting point, it is important to realize that it is just the first stepping stone; on its own it can only provide limited business benefits. It is actually the higher-level services, such as (application) platform and business applications, that will bring about a more meaningful transformation to your enterprise.

    Run and Manage Your Mission-Critical Applications Efficiently: It should not only be able to run your mission-critical applications, it should do so better than before. For enterprises, applications and data are the critical business assets. As such, if you are building a cloud platform that cannot run your ERP application, it isn't truly an "enterprise cloud". Also, be wary of vendors who try to sell you the idea that your applications must be written in a certain way to be able to run on the cloud. That is nothing but a bogus, self-serving argument. For the cloud to be meaningful to enterprises, it should adapt to your applications, and not the other way around.

    Automated, Integrated Set of Cloud Management Capabilities: At the root of many of the problems plaguing enterprise IT today is complexity. A complex maze of tools and technology, coupled with archaic processes, results in an environment which is inflexible, inefficient and simply too hard to manage. Management tool consolidation is therefore key to the success of your cloud, as tool proliferation adds to complexity, encourages compartmentalization and defeats the very purpose for which you are building the cloud. Decision makers ought to be extra cautious about vendors trying to sell them a "suite" of disparate and loosely integrated products as a cloud solution. An effective enterprise cloud management solution needs to provide a tightly integrated set of capabilities for all aspects of cloud lifecycle management. A simple question to ask: will your environment be more or less complex after you implement your cloud? More often than not, the answer will surprise you.

    At Oracle, we have understood these challenges and have been working hard to create cloud solutions that are relevant and meaningful for enterprises. And we have been doing it for much longer than you may think. Oracle was one of the very first enterprise software companies to make our products available on the Amazon Cloud. As far back as 2007, we created new cloud solutions, such as Cloud Database Backup, that are helping customers like Amazon save millions every year. Our cloud solution portfolio is also the broadest and deepest in the industry, covering public, private and hybrid clouds across the infrastructure, platform and applications tiers. It is no coincidence, therefore, that the Oracle Cloud today offers the most comprehensive set of public cloud services in the industry. To a large part, this has been made possible by our years of investment in creating cloud-enabling technologies.

    I will dedicate the third and final part of the blog “Clouds, Clouds Everywhere But not a Drop of Rain” to Oracle Cloud Technologies Building Blocks and how they map into our vision of the Enterprise Cloud. Stay tuned.

    Read the article

  • How Many Web Pages Should Be Indexed?

    Search engines crawl websites around the clock for unique web pages and content. Google has always been at the top in indexing the deep links of any website: Google indexed 26 million pages in 1998, and in the past 10 years it has indexed over 1 trillion pages. That gives a fair idea of how big this cyber world is.

    Read the article

  • How to Go About Improving Your Search Engine Rankings

    Is your website buried so deep in the low pages of search results that you wonder how anyone will ever discover your site? Chances are, if you haven't optimized your website, you're not ranking in the top 10-20 search engine results for any of the phrases or words that you want your service or product to rank for. To raise your ranking in Google, Bing, Yahoo! and other search engines, you'll need to tweak your website to include keywords that are directly relevant to your site but not too general or too competitive.

    Read the article

  • Register now for a complimentary Oracle Health Sciences 3-day workshop on Enterprise Healthcare Analytics training in Dallas, US, Nov 12-14, 2013!

    - by Roxana Babiciu
    Join Oracle Health Sciences for an informative overview of Oracle Enterprise Healthcare Analytics (EHA) for Sales/Business Development and Implementation team members. You’ll gain an understanding of the Oracle EHA product strategy, get a platform overview, and hear customer success stories that will help you in the field. Be ready for technical education and training spanning three days of deep expertise sharing.

    Read the article


  • Data recovery on a corrupted 3TB disk

    - by Mark K Cowan
    Short version: I probably need software to run a deep-scan recovery (ideally on Linux) to find files on an NTFS filesystem. The file data is intact, but the references are no longer present; it's analogous to recovering data from a "quick-formatted" partition. Hopefully there is a smarter way available than a deep scan, one which would recover filenames and possibly paths.

    Long version: I have a 3TB disk containing a load of backups. Windows 7 SP1 refused to detect the disk when plugged in directly via SATA, so I put it on a USB/SATA adaptor, which seemed to work at first (though the adaptor probably does not support disks over 2.2TB). Windows first asked me if I wanted to 'format' the disk, then later showed me most of the contents, but some folders were inaccessible. I stupidly decided to run CHKDSK on my backup disk, which made the folders accessible but also left them empty. I then connected the disk via SATA to my main PC (Arch Linux) and tried:

    testdisk
    ntfsundelete
    ntfsfix --no-action (to look for diagnostically relevant faults; the disk was "OK" though)

    all to no avail, as the file references in the tables had presumably been zeroed out by CHKDSK rather than removed via a typical journaled deletion. If it is useful at all, the majority of the files that I want to recover are JPEG, Photoshop PSD, and MPEG-3/MPEG-4/AVI/MKV files. If worst comes to worst, I'll just design my own sector scanner and use some simple heuristic-driven analysis to recover raw binary blocks of data from the disk which appear to match the structures of the above file types. I am unfamiliar with the exact workings of NTFS, but I used to be proficient at recovering FAT32 systems with just a hex editor, so I can provide any useful diagnostic information if you let me know how to find it!

    My priorities, in ascending order of importance for choosing the accepted answer:

    Restores directory structure
    Recovers many filenames in addition to the file data
    Is free / very cheap
    Runs on Linux
    Recovers a majority of file data

    The last point is the most important, but the more of the higher points you match the more rep you'll probably get :)
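    For the worst-case "design my own sector scanner" plan, here is a minimal carving sketch in Java that scans a raw dd image for the JPEG start-of-image marker (illustrative only; PhotoRec, which ships alongside TestDisk, already does this kind of signature carving far more robustly):

        import java.io.BufferedInputStream;
        import java.io.FileInputStream;

        public class JpegCarver {
            public static void main(String[] args) throws Exception {
                // args[0]: path to a raw dump of the disk (e.g. made with dd)
                try (BufferedInputStream in = new BufferedInputStream(new FileInputStream(args[0]))) {
                    long offset = 0;
                    int a = in.read(), b = in.read(), c = in.read();
                    while (c != -1) {
                        // JPEG files begin with the marker bytes FF D8 FF
                        if (a == 0xFF && b == 0xD8 && c == 0xFF) {
                            System.out.printf("possible JPEG header at byte %d%n", offset);
                        }
                        a = b; b = c; c = in.read();
                        offset++;
                    }
                }
            }
        }

    The same sliding-window idea extends to PSD ("8BPS") and the MPEG/AVI/MKV container signatures, though carving recovers neither filenames nor directory structure.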

    Read the article

  • Moving symlinks into a folder based on id3 tags.

    - by Reti
    I'm trying to get my music folder into something sensible. Right now, I have all my music stored in /home/foo, with all of the albums soft-linked into ~/music. I want the structure to be ~/music/<artist>/<album>. I've got all of the symlinks into ~/music already, so I just need to get them into the proper structure. I'm trying to do this by delving into each symlinked album and getting the artist name with id3info:

    # NUL-delimited paths survive spaces in directory names, unlike the
    # unquoted `for i in $(find ...)` loop I started with, which split on them
    find -L ~/music -name "*.mp3" -printf "%h\0" | sort -zu |
    while IFS= read -r -d '' album
    do
        echo "$album"    # testing purposes
        # take the first mp3 in the album and cut id3info down to just the artist name
        file=$(find -L "$album" -name "*.mp3" | head -n1)
        artist=$(id3info "$file" | grep TPE | sed "s|.*: \(.*\)|\1|" | head -n1)
        # move the album into the correct artist folder
        mkdir -p "$artist"
        mv "$album" "$artist"/
    done

    My original version did find the correct folder, but every time there was a space in the dir name it treated the parts as separate entries. Here's a sample of what I'm trying to do:

    $ ls
    DJ Exortius/
    The Trance Mix 3 Wanderlust - DJ Exortius [TRANCE DEEP VOCAL TECH]@

    I'm trying to mv 'The Trance Mix 3 Wanderlust - DJ Exortius [TRANCE DEEP VOCAL TECH]' into the real directory 'DJ Exortius'. 'DJ Exortius' already exists, so it's just a matter of moving the album into the correct directory based on the id3 tag of the mp3 inside. Thanks!

    PS: I've tried EasyTAG, but when I restructure the album, it moves it from /home/foo, which is not what I want.

    Read the article

  • Where in the stack are Software Restriction Policies implemented?

    - by Knox
    I am a big fan of Software Restriction Policies for Microsoft Windows and was recently updating our settings for them. I became curious as to where Microsoft implemented this technology in the stack. I can imagine a very naive implementation in Windows Explorer, where double-clicking an exe or other blocked file type would cause Explorer to check against the policy. I call this naive because it obviously wouldn't protect against someone typing something into a CMD window, or worse, Adobe Reader running an external application. On the other hand, I can imagine software restriction policies being implemented deep in the stack, almost at the metal. In that case, the low-level loader would load the questionable file into memory, but mark that memory as non-executable data in the memory manager. I'm pretty sure that Microsoft did not do the most naive implementation, because if I block Java using a path-based rule, Internet Explorer will crash if it attempts to load Java, which is what I want. But I'm not sure how deep in the stack it's implemented, and any insight would be appreciated.

    Read the article

  • Friday Fun: Spell Blazer

    - by Asian Angel
    Are you ready for some fun and adventure after a long week back at work? This week’s game combines jewel-matching style game play with an RPG story for an awesome mix of fun and fiction. Your goal is to help a young wizard reach the magic academy in Raven as the forces of darkness are building.

    Spell Blazer

    The object of the game is to help young Kaven reach the Lightcaster Academy in Raven alive, but he will encounter many dangers along the way. Are you ready to begin the quest? As soon as you click Start Game the intro will automatically begin. If this is your first time playing the game, the intro provides a nice background story for the game and what is happening in the game environment.

    Once you are past the intro, you will see a map of the region with your starting point in the Farmlands, various towns and the roads connecting them, along with your final destination of Raven. Notice that some of the roads are different colors; those colors indicate the “danger levels” for each part of your journey (green = good, yellow = some danger, etc.). To begin your journey, click on the Town of Goose with your mouse.

    You will encounter your first monster part of the way towards Goose. This first round takes you through the game play process step-by-step. Once you have clicked Okay, you will see the details about the monster you have just encountered. It is very important that you do not click Fight! or Flee! until viewing and noting the types of spells that the monster is resistant to or has a weakness against. Choose your spells wisely based on the information provided about the monster. Keep in mind that the healing spell can be very useful depending on the monster you meet and your current health status. Note: The spells, shown in order, are Healing, Fireball, Icebolt, and Lightning.

    Ready to fight! The first battle will also explain how to fight; click Okay to get started. Once the main window is in full view, there are details that you need to look at: beneath each of the combatants you will see the three attacks that each brings to the battle, and at the bottom you will see their respective health points. We got lucky and had an Icebolt attack that we could utilize on the first play! Note: You can exchange two squares without making a match in order to try to line up an attack. Though it happens too quickly to capture in a screenshot, cool lightning-bolt effects shoot out from matched-up squares to the opposite combatant, and the amount of damage inflicted by an attack appears above the avatars.

    Victory! Once you have won a round of combat, a window will appear showing the amount of gold coins left behind by the monster. When you reach a town you will have the opportunity to stop over and rest or continue directly on with your journey. On to Halgard after a good rest!
    Play Spell Blazer

    Read the article

  • A temporary disagreement

    - by Tony Davis
    Last month, Phil Factor caused a furore amongst some MVPs with an article that attempted to offer simple advice to developers regarding the use of table variables versus local and global temporary tables in their code. Phil makes clear that table variables do come with some fairly major limitations (no distribution statistics, no parallel query plans for queries that modify table variables), but goes on to suggest that for reasonably small-scale strategic uses, and with a bit of due care and testing, table variables are a “good thing”. Not everyone shares his opinion; in fact, I imagine he was rather aghast to learn that there were those who felt his article was akin to pulling the pin out of a grenade and tossing it into the database; table variables should be avoided in almost all cases, according to their advice, in favour of temp tables. In other words, a fairly major feature of SQL Server should be more-or-less ‘off limits’ to developers.

    The problem with temp tables is that, because they are scoped either to the procedure or the connection, it is easy to allow them to hang around for too long, eating up precious memory and bulking up the shared tempdb database. Unless they are explicitly dropped, global temporary tables, and local temporary tables created within a connection rather than within a stored procedure, will persist until the connection is closed or, with connection pooling, until the connection is reused. It's also quite common with ASP.NET applications to have connection leaks, as Bill Vaughn explains in his chapter in the "SQL Server Deep Dives" book, meaning that the web page exits without closing the connection object, maybe due to an error condition. The connection will then hang around in the heap for what might be hours before being picked up by the garbage collector.

    Table variables are much safer in this regard, since they are batch-scoped and so are cleaned up automatically once the batch is complete, which also means that they are intuitive for developers to use because they conform to scoping rules that are closer to those in procedural code. On the surface, then, they seem an ideal way to deal with issues related to tempdb memory hogging. So why did Phil qualify his recommendation to use table variables?

    This is another of those cases where, like scalar UDFs and multi-statement table-valued UDFs, developers can sometimes get into trouble with a relatively benign-looking feature, due to the way it's been implemented in SQL Server. Once again the biggest problem is how table variables are handled internally by the SQL Server query optimizer, which can make very poor choices for JOIN orders and so on in the absence of statistics, especially when joining to tables with highly skewed data. The resulting execution plans can be horrible, as will be the resulting performance. If the JOIN is to a large table, that will hurt. Ideally, Microsoft would simply fix this issue so that developers can't get burned in this way; table variables have been around since SQL Server 2000, so Microsoft has had a bit of time to get it right. As I commented in regard to UDFs, when developers discover issues like this with standard features, the database becomes an alien planet to them, where death lurks around each corner, and they continue to avoid these "killer" features years after the problems have eventually been resolved.

    In the meantime, what is the right approach? Is it to say "hammers can kill, don't ever use hammers", or is it to try to explain, as Phil's article and follow-up blog post have tried to do, what the feature was intended for, why care must be applied in its use, and so enable developers to make properly informed decisions, without requiring them to delve deep into the inner workings of SQL Server?

    Cheers, Tony.

    Read the article
