Search Results

Search found 9104 results on 365 pages for 'head geek'.

Page 10 of 365

  • Talk About OEM Vendor Crapware Overkill! [Humorous Image]

    - by Asian Angel
    It is natural to expect some company-branded crapware on OEM computers, but this is just a bit much! Getting a new laptop ready for a customer…my god Toshiba, WTF? [via Reddit - Tech Support Gore]

    Read the article

  • ASP.NET MVC Head Verb and Selenium RC

    - by Rob
    Selenium (RC) is being used to test an ASP.NET 1.1 site. When we make a request via Selenium RC (and then in turn via Firefox or another configured browser) the HTTP verb is "HEAD". We have several form action methods that have separate POST and GET methods decorated with AcceptVerbs(HttpVerbs.Get). These methods are returning a 404 and logging an "a public action method could not be found" error message. Questions: When writing separate Get/Post action methods, what is the best practice for handling the HEAD verb? Should we always decorate with AcceptVerbs(HttpVerbs.Get | HttpVerbs.Head)? The only time we've seen HEAD verbs is from Selenium and from some crawlers. We created robots.txt entries for the crawlers. Does anyone know why HEAD verbs are issued via Selenium (instead of an If-Modified-Since header)? Are there mainstream crawlers that use the HEAD verb? Does it affect SEO ranking to not have it?
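
    A quick way to reproduce this outside Selenium is to issue a HEAD request directly; the sketch below uses curl (the URL is a placeholder for one of the affected GET actions, not something taken from the original question):

      # curl -I sends a HEAD request (headers only, no body).
      # With the action decorated only as [AcceptVerbs(HttpVerbs.Get)] this currently returns 404;
      # after changing it to [AcceptVerbs(HttpVerbs.Get | HttpVerbs.Head)] it should return 200.
      curl -I http://localhost/orders/edit/42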

    Read the article

  • Doug Crockford: Geek of the Week

    Doug Crockford is the man behind JavaScript Object Notation (JSON). He is a well-known critic of XML and guides the development of JavaScript on the ECMA Standards Committee, as well as being the senior JavaScript architect at Yahoo! He is also the author of the popular 'JavaScript: The Good Parts'. Richard Morris was dispatched to ask him what the good parts were...

    Read the article

  • Geek City: SQL Server 2014 In-Memory OLTP (“Hekaton”) Whitepaper for CTP2

    - by Kalen Delaney
    Last week at the PASS Summit in Charlotte, NC, the update of my whitepaper for CTP2 was released. The manager supervising the paper at Microsoft told me that David DeWitt himself said some very nice things about the technical quality of the paper, which was one of the most ego-enhancing compliments I have ever gotten! Unfortunately, Dr. DeWitt said those things at his “After-the-keynote” session, not in the keynote that was recorded, so I only have my manager’s word for it. But I’ll take what I can...(read more)

    Read the article

  • It's Official, I'm a Geek

    - by andyleonard
    I'm honored to join Glen Gordon ( Blog - @glengordon ) and G. Andrew Duthie ( Blog - @devhammer ) today at 3:00 PM EDT for an MSDN Webcast entitled GeekSpeak: Inside SQL Server Integration Services (SSIS). This is a LiveMeeting and you can join in the fun as an attendee here. It's a live show, so bring your questions! :{> Andy ...(read more)

    Read the article

  • Triple-head on a Lenovo T520

    - by codeape
    A Lenovo T520 with integrated Intel HD graphics plus an NVIDIA card (Optimus), running Ubuntu 11.10. I would like to use the built-in screen plus two external screens. This PDF indicates that it is possible to connect up to four external monitors to the laptop, but the information is Windows-only. I was planning to disable the NVIDIA card, since I have read that Linux support for Optimus is not good. Questions: Has anyone set up three monitors on NVIDIA hardware? Has anyone set up three monitors using Intel HD 3000? Can I expect it to work out of the box, or are there tricks I need to be aware of?
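
    For the Intel route, a hedged sketch of the xrandr commands involved (the output names LVDS1, HDMI1 and VGA1 are examples only; run plain xrandr first to see the names your driver reports, and note that the Intel HD 3000 is commonly reported to drive only two displays at once, so the third screen may need the NVIDIA GPU or a dock):

      # List connected outputs and their supported modes.
      xrandr

      # Arrange the built-in panel plus two external screens side by side
      # (replace the output names with the ones reported above).
      xrandr --output LVDS1 --auto \
             --output HDMI1 --auto --right-of LVDS1 \
             --output VGA1  --auto --right-of HDMI1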

    Read the article

  • Chuck Esterbrook: Geek of the Week

    The Cobra Programming Language is an exciting new general-purpose Open-source language for .NET or Mono, which features unit tests, contracts, informative asserts, generics, Compile-time nil/null tracking, lambda expressions, closures, list comprehensions and generators. Even if it had been developed by a team, it would have been a remarkable achievement. The surprise is that it is the work of one programmer with help from a group of users. We sent Richard to find out more about that one progra...

    Read the article

  • EPM and Business Analytics Talking-head Videos from Oracle OpenWorld 2013

    - by Mike.Hallett(at)Oracle-BI&EPM
    Here is a selection of 2 to 3 minute video interviews at this year’s Oracle OpenWorld:
    1. George Somogyi, Solutions Architect, New Edge Group, talks about the importance of having their integrated Oracle Hyperion Platform consisting of Oracle Hyperion Financial Management, Oracle Hyperion Financial Data Quality Management, Oracle E-Business Suite R12 and Oracle Business Intelligence Extended Edition plus their use of Oracle Managed Cloud Services. Speaker: George Somogyi @ http://youtu.be/kWn0dQxCUy8
    2. Gregg Thompson, Director of Financial Systems for ADT, talks about using Oracle Data Relationship Management prior to implementing an Enterprise Performance Management solution. Gregg confirmed that there are big benefits to bringing the full Oracle Hyperion Financial Close suite online with Oracle DRM as the metadata source. Reduced maintenance time and use of external consultants translates into significant time and cost savings and faster implementation times. Speaker: Gregg Thompson @ http://youtu.be/XnFrR9Uk4xk
    3. Jeff Spangler, Director Financial Planning and Analysis for Speedy Cash Holdings Corp, talked to us about the benefits achieved through implementing Oracle Hyperion Planning and financial reporting solutions. He also describes how the use of Data Relationship Management will keep the process running smoothly now and in the future. Speaker: Jeff Spangler @ http://youtu.be/kkkuMkgJ22U
    4. Marc Seewald, Senior Director of Product Management for Oracle Hyperion Tax Provision at Oracle, talks about Oracle Hyperion Tax Provision, how it is an integral part of the financial close process and how it provides better internal controls and automation of this task. Marc talks about Oracle Partners and customers alike who are seeing great value. Speaker: Marc Seewald @ http://youtu.be/lM_nfvACGuA
    5. Matt Bradley, SVP of Product Development for Enterprise Performance Management (EPM) Applications at Oracle, talked to us about different deployment options for Oracle EPM. Cloud services (SaaS), managed services, on-premise and off-premise all have their merits, and organizations need flexibility to easily move between them as their companies evolve. Speaker: Matt Bradley @ http://youtu.be/ATO7Z9dbE-o
    6. Neil Sellers, Partner, Qubix International, talks about their experience with previewing Oracle’s new Planning and Budgeting Cloud Service. He describes the benefits of the step-by-step task lists, the speed of getting the application up and running, and the huge benefits of not having to manage the software and hardware side of the planning process. Speaker: Neil Sellers @ http://youtu.be/xmosO28e4_I
    7. Praveen Pasupuleti, Senior Business Intelligence Development Manager of Citrix Systems Inc., talks about their Oracle Hyperion Planning upgrade and the huge performance improvement now experienced in forecasting. He also talked about the benefits of Oracle Hyperion Workforce Planning achieved by Citrix. Speaker: Praveen Pasupuleti @ http://youtu.be/d1e_4hLqw8c
    8. CheckPoint Consulting talked to us about how Enterprise Performance Management should be viewed as an entire solution, rather than as a bunch of applications in silos, to provide significant benefits, and how Data Relationship Management can tie it all together effectively. Speaker: Ron Dimon @ http://youtu.be/sRwbdbbXvUE
    9. Sonal Kulkarni, Enterprise Performance Management Leader, Cummins Inc., talks about their use of Oracle Hyperion Financial Close Management (Account Reconciliation Manager), Oracle Hyperion Financial Management and Oracle Hyperion Financial Data Quality Management and how this is providing efficiency, visibility and compliance benefits. Speaker: Sonal Kulkarni @ http://youtu.be/OEgup5dKyVc
    10. Todd Renard, Manager Financial Planning and Business Analytics for B/E Aerospace Inc., talks about the huge benefits that B/E Aerospace is experiencing from Oracle Financial Close Suite. He was extremely excited about Oracle Hyperion Financial Data Quality Management and how this helps them integrate a new business in as little as three weeks. Speaker: Todd Renard @ http://youtu.be/nIfqK46uVI8
    11. Peter Smolianski, Chief Technology Officer for the District of Columbia Courts, talked to us about how D.C. Courts is using Oracle Scorecard and Strategy Management to push their 5 year plan forward, to report results to their constituents, and take accountability for process changes to become more efficient. Speaker: Peter Smolianski @ http://www.youtube.com/watch?v=T-DtB5pl-uk
    12. Rich Wilkie, Senior Director of Product Management for Financial Close Suite at Oracle, talked to us about Oracle Financial Management Analytics. He told us how the prebuilt dashboards on top of Oracle Hyperion Financial Close Suite make it easy for everyone to see the numbers and understand where they are in the close process, and if there is an issue, they can see where it is. Executives are excited to get this information on mobile devices too. Speaker: Rich Wilkie @ http://www.youtube.com/watch?v=4UHuHgx74Yg
    13. Dinesh Balebail, Senior Director of Software Development for Oracle Hyperion Profitability and Cost Management, talked to us about the power and speed of Oracle Hyperion Profitability and Cost Management and how it is being used to do deep costing for telecoms, hospitals, banks and other high transaction volume organizations effectively. Speaker: Dinesh Balebail @ http://youtu.be/ivx5AZCXAfs

    Read the article

  • Geek City: Clearing Plans for a Single Database

    - by Kalen Delaney
    I know Friday afternoon isn't the best time for blogging, as everyone is going home now, and by Monday morning, this post will be old news. But I'm not shutting down just yet, and something came up this week that I just realized not everybody knew about, so I decided to blog it. Many (or most?) of you are aware that you can clear all cached plans using DBCC FREEPROCCACHE. In addition, there are certain configuration options for which changing the value will cause all plans in cache to be removed....(read more)
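
    The teaser does not say which technique the post covers; one commonly cited (but undocumented and unsupported) way to clear plans for a single database is DBCC FLUSHPROCINDB, sketched here via sqlcmd with placeholder server and database names:

      # Hypothetical sketch only: flush cached plans for one database by database id.
      # DBCC FLUSHPROCINDB is undocumented, so treat this as a dev/test convenience,
      # not necessarily the approach the post describes.
      sqlcmd -S localhost -E -Q "DECLARE @dbid int = DB_ID('MyDatabase'); DBCC FLUSHPROCINDB(@dbid);"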

    Read the article

  • Geek City: Growing Rows with Snapshot Isolation

    - by Kalen Delaney
    I just finished a wonderful week in Stockholm, teaching a class for Cornerstone Education. We had 19 SQL Server enthusiasts, all eager to find out everything they could about SQL Server Internals. One question came up on Thursday that I wasn’t sure of the answer to. I jokingly told the student who asked it to consider it a homework exercise, but then I was so interested in the answer that I tried to figure it out myself on Thursday evening. In this post, I’ll tell you what I did to try to answer the question....(read more)

    Read the article

  • 4 Geek Excuses for Bad Presentations

    Frustrated by technically interesting yet agonizing conferences, Dr. Masha Petrova leaves geeks with no excuses for making bad presentations, and begins her campaign to ensure that the people with good ideas also have the presentation skills to back them up and get them noticed.

    Read the article

  • Geek City: Where are LOBs stored?

    - by Kalen Delaney
    When researching a question from one of the students in my class last week, I was reading the documentation for CREATE TABLE about storing LOB columns at http://msdn.microsoft.com/en-us/library/ms174979.aspx . For this discussion, LOB columns include text, image, ntext, xml and the MAX columns when they are over 8000 bytes and stored outside the regular data row. I knew that SQL Server gives us the capability of storing LOB columns in a separate filegroup with the TEXTIMAGE_ON clause, but I was surprised...(read more)
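
    For reference, a minimal sketch of the TEXTIMAGE_ON clause mentioned above, run here through sqlcmd (the server, database, table and filegroup names are placeholders, and the LOB_FG filegroup must already exist in the database):

      # Put the table's LOB data on a separate filegroup; large values in the body
      # column are stored there once they cannot fit in the regular data row.
      sqlcmd -S localhost -E -d MyDatabase -Q "CREATE TABLE dbo.Documents (id int PRIMARY KEY, body nvarchar(max)) ON [PRIMARY] TEXTIMAGE_ON LOB_FG;"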

    Read the article

  • Geek City: Preparing for the SQL Server Master Exam

    - by Kalen Delaney
    I was amazed at the results when I just did a search of SQLBlog, and realized no one had really blogged here about the changes to the Microsoft Certified Master (MCM) program. Greg Low described the MCM program when he decided to pursue the MCM at the end of 2008, but two years later, at the end of 2010, Microsoft completely changed the requirements. Microsoft published the new requirements here. The three-week intensive course is no longer required, but that doesn't mean you can just buy an exam...(read more)

    Read the article

  • Programming Geek's CV Design

    - by benhowdle89
    What is the best practice or advice for a programmer's CV? If I were a graphic designer or more of a web designer as opposed to a programmer, then I'd imagine a CV would have a bit more flourish to it. Yet as programmers, aren't we judged not only on efficiency and results but also on creativity, imagination and initiative? Should this be reflected in a CV? Or is it all about the information, with no wishy-washy designs in sight? Has anyone experimented with an original alternative to a PDF CV built from an MS Word or Mac Pages template?

    Read the article

  • Getting My Head Around Immutability

    - by Michael Mangold
    I'm new to object-oriented programming, and one concept that has been taking me a while to grasp is immutability. I think the light bulb went off last night but I want to verify: When I come across statements that an immutable object cannot be changed, I'm puzzled because I can, for instance, do the following: NSString *myName = @"Bob"; myName = @"Mike"; There, I just changed myName, of immutable type NSString. My problem is that the word, "object" can refer to the physical object in memory, or the abstraction, "myName." The former definition applies to the concept of immutability. As for the variable, a more clear (to me) definition of immutability is that the value of an immutable object can only be changed by also changing its location in memory, i.e. its reference (also known as its pointer). Is this correct, or am I still lost in the woods?

    Read the article

  • More free geek-read. December SolidQ Journal is online

    - by Greg Low
    I'm really excited to see the last SolidQ Journal for this year out the door. It's our free online magazine. I've been wondering about the future of printed technical magazines for a long time. I doubt they have much of a future, as online publications become more prevalent and more timely. By the time a print magazine gets to you, it's such a long time since the author wrote the material that it's hard to even retain relevance in a fast moving world. That's why I'm so happy to have the format we...(read more)

    Read the article

  • Head in the Clouds

    - by Tony Davis
    We're just past the second anniversary of the launch of Windows Azure. A couple of years' experience with Azure in the industry has provided some obvious success stories, but has deflated some of the initial marketing hyperbole. As a general principle, Azure seems to work well in providing a Service-Oriented Architecture for services in enterprises that suffer wide fluctuations in demand. Instead of being obliged to provide hardware sufficient for the occasional peaks in demand, one can hire capacity only when it is needed, and the cost of hosting an application is no longer a capital cost. It enables companies to avoid having to scale out hardware for peak periods only to see it underused for the rest of the time. A customer-facing application such as a concert ticketing system, which suffers high demand in short, predictable bursts of activity, is a great example of an application that would work well in Azure.

    However, moving existing applications to Azure isn't something to be done on impulse. Unless your application is .NET-based, and consists of 'stateless' components that communicate via queues, you are probably in for a lot of redevelopment work. It makes most sense for IT departments who are already deep in this .NET mindset, and who also want 'grown-up' methods of staging, testing, and deployment. Azure fits well with this culture and offers, as a bonus, good Visual Studio integration.

    The most commonly stated barrier to porting these applications to Azure is the problem of reconciling the use of the cloud with legislation for data privacy and security. Putting databases in the cloud is a sticky issue for many and impossible for some, due to compliance and security issues, the need for direct control over data, and so on.

    In the face of feedback from the early adopters of Azure, Microsoft has broadened the architectural choices to cater for a wide range of requirements. As well as SQL Azure Database (SAD), Azure storage, and the unstructured 'BLOB and Entity-Attribute-Value' NoSQL alternative (which equates more closely with folders and files than a database), Windows Azure offers a range of further storage options, including services such as OData: developers who are programming for Windows Azure can simply choose the one most appropriate for their needs.

    Secondly, and crucially, the Windows Azure architecture allows you the freedom to produce hybrid applications, where only those parts that need cloud-based hosting are deployed to Azure, whereas those parts that must unavoidably be hosted in a corporate datacenter can stay there. By using a hybrid architecture, it will seldom, if ever, be necessary to move an entire application to the cloud along with personal and financial data. For example, we could port to Azure only those parts of our ticketing application that capture and process ticket orders. Once an order is captured, the financial side can be processed in our own data center.

    In short, Windows Azure seems to be a very effective way of providing services that are subject to wide but predictable fluctuations in demand. Have you come to the same conclusions, or do you think I've got it wrong? If you've had experience with Azure, would you recommend it? It would be great to hear from you. Cheers, Tony.

    Read the article

  • Insights From a Non-Geek Working With Technical Developers at a Software Startup

    Everyone is wired differently. Some people are artistic, some are leaders and some are highly technical. Most of the time, it is fairly difficult for these different types of people to communicate effectively and understand each others' limitations and strengths. This can be especially true if you find yourself working as a non-technical employee in a highly technical field such as software development.

    Read the article

  • Mounting ddrescue image after recovery (in over my head)

    - by BorgDomination
    I'm having problems mounting the recovery image. I've tried to mount the image multiple ways.

      quark@DS9 ~ $ sudo mount -t ext4 /media/jump1/1recover/sdb1.img /mnt
      mount: wrong fs type, bad option, bad superblock on /dev/loop0,
             missing codepage or helper program, or other error
             In some cases useful info is found in syslog - try
             dmesg | tail or so

      quark@DS9 ~ $ sudo mount -r -o loop /media/jump1/1recover/sdb1.img recover
      mount: you must specify the filesystem type

      quark@DS9 ~ $ sudo mount /media/jump1/1recover/sdb1.img mnt
      mount: you must specify the filesystem type

    It doesn't even give me detailed information on the file I just made, nautilus says it's 160gb.

      quark@DS9 ~ $ file /media/jump1/1recover/sdb1.img
      /media/jump1/1recover/sdb1.img: data

      quark@DS9 ~ $ mmls /media/jump1/1recover/sdb1.img
      Cannot determine partition type

    I'm not sure what I'm doing wrong or if I started this process incorrectly from the beginning. I've outlined what I've done so far below. I'm clueless, I'd appreciate if someone had some input for me.

    What I have done from the beginning:

    My laptop has two hard drives. One has the dual boot Win7 / Linux Mint system files. Secondary one contained my /home folder. The laptop was jarred and the /home disk was broken. I tried a LiveCD recovery, it failed. Wouldn't even load a Live session with the disk installed. So I turned to ddrescue.

      quark@DS9 ~ $ sudo fdisk -l

      Disk /dev/sda: 160.0 GB, 160041885696 bytes
      255 heads, 63 sectors/track, 19457 cylinders, total 312581808 sectors
      Units = sectors of 1 * 512 = 512 bytes
      Sector size (logical/physical): 512 bytes / 512 bytes
      I/O size (minimum/optimal): 512 bytes / 512 bytes
      Disk identifier: 0x0009fc18

         Device Boot      Start         End      Blocks   Id  System
      /dev/sda1   *         2048   112642047    56320000    7  HPFS/NTFS/exFAT
      /dev/sda2        138033152   312580095    87273472   83  Linux
      /dev/sda3        112644094   138033151    12694529    5  Extended
      /dev/sda5        112644096   132173823     9764864   83  Linux
      /dev/sda6        132175872   138033151     2928640   82  Linux swap / Solaris

      Partition table entries are not in disk order

      Disk /dev/sdb: 160.0 GB, 160041885696 bytes
      255 heads, 63 sectors/track, 19457 cylinders, total 312581808 sectors
      Units = sectors of 1 * 512 = 512 bytes
      Sector size (logical/physical): 512 bytes / 512 bytes
      I/O size (minimum/optimal): 512 bytes / 512 bytes
      Disk identifier: 0x0002a8ea

         Device Boot      Start         End      Blocks   Id  System
      /dev/sdb1   *           63   312576704   156288321   83  Linux

      Disk /dev/sdc: 1000.2 GB, 1000204886016 bytes
      255 heads, 63 sectors/track, 121601 cylinders, total 1953525168 sectors
      Units = sectors of 1 * 512 = 512 bytes
      Sector size (logical/physical): 512 bytes / 512 bytes
      I/O size (minimum/optimal): 512 bytes / 512 bytes
      Disk identifier: 0xed6d054b

         Device Boot      Start         End      Blocks   Id  System
      /dev/sdc1               63  1953520064   976760001    7  HPFS/NTFS/exFAT

    sda - 160g internal, holds all system files and all computer functions.
    sdb - 160g internal, BROKEN, contains about 140g of data I'd like to recover.
    sdc - 1T external, contains recovery image. Only place that has space to do all this.

    From this site, https://apps.education.ucsb.edu/wiki/Ddrescue I used this script to create an image of the broken hard drive. I changed the destination to the external USB drive.

      #!/bin/sh
      prt=sdb1
      src=/dev/$prt
      dst=/media/jump1/1recover/$prt.img
      log=$dst.log
      sudo time ddrescue --no-split $src $dst $log
      sudo time ddrescue --direct --max-retries=3 $src $dst $log
      sudo time ddrescue --direct --retrim --max-retries=3 $src $dst $log

    Everything looked like it came off without a hitch:

      quark@DS9 ~ $ sudo bash recover1
      Press Ctrl-C to interrupt
      Initial status (read from logfile)
      rescued: 0 B, errsize: 0 B, errors: 0
      Current status
      rescued: 160039 MB, errsize: 4096 B, current rate: 35588 B/s
         ipos: 3584 B, errors: 1, average rate: 22859 kB/s
         opos: 3584 B, time from last successful read: 0 s
      Finished
      12.78user 1060.42system 1:56:41elapsed 15%CPU (0avgtext+0avgdata 4944maxresident)k
      312580958inputs+0outputs (1major+601minor)pagefaults 0swaps

      Press Ctrl-C to interrupt
      Initial status (read from logfile)
      rescued: 160039 MB, errsize: 4096 B, errors: 1
      Current status
      rescued: 160039 MB, errsize: 1024 B, current rate: 0 B/s
         ipos: 1536 B, errors: 1, average rate: 13 B/s
         opos: 1536 B, time from last successful read: 1.3 m
      Finished
      0.00user 0.00system 3:43.95elapsed 0%CPU (0avgtext+0avgdata 4944maxresident)k
      238inputs+0outputs (3major+374minor)pagefaults 0swaps

      Press Ctrl-C to interrupt
      Initial status (read from logfile)
      rescued: 160039 MB, errsize: 1024 B, errors: 1
      Current status
      rescued: 160039 MB, errsize: 1024 B, current rate: 0 B/s
         ipos: 1536 B, errors: 1, average rate: 0 B/s
         opos: 1536 B, time from last successful read: 3.7 m
      Finished
      0.00user 0.00system 3:43.56elapsed 0%CPU (0avgtext+0avgdata 4944maxresident)k
      8inputs+0outputs (0major+376minor)pagefaults 0swaps

    It looks like, from where I'm standing it worked perfectly. Here's the log:

      # Rescue Logfile. Created by GNU ddrescue version 1.14
      # Command line: ddrescue --direct --retrim --max-retries=3 /dev/sdb1 /media/jump1/1recover/sdb1.img /media/jump1/1recover/sdb1.img.log
      # current_pos  current_status
      0x00000600     +
      #      pos        size  status
      0x00000000  0x00000400  +
      0x00000400  0x00000400  -
      0x00000800  0x254314FC00  +

    I'm not sure how to proceed. Does this mean all of my data is lost???????? Appreciate ANY input!
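
    Reading the log above, the single unreadable kilobyte sits at offsets 0x400 to 0x800, which is exactly where ext2/3/4 keeps its primary superblock; that would explain both the mount errors and why file reports the image as plain "data". A minimal recovery sketch, assuming sdb1.img really is an ext3/ext4 partition image (the backup-superblock number 32768 is an assumption until mke2fs -n confirms it, and e2fsck writes to the image, so work on a copy if possible):

      # Ask where the backup superblocks would be for a filesystem this size.
      # The -n flag only simulates mke2fs; it does not write to the image.
      mke2fs -n /media/jump1/1recover/sdb1.img

      # Check and repair the filesystem using one of the reported backup superblocks
      # (32768 is the usual first backup when the block size is 4 KiB).
      e2fsck -b 32768 -B 4096 /media/jump1/1recover/sdb1.img

      # If e2fsck restores the primary superblock, a read-only loop mount should work.
      sudo mount -o loop,ro /media/jump1/1recover/sdb1.img /mnt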

    Read the article
