Search Results

Search found 30252 results on 1211 pages for 'network programming'.


  • Macbook (non-pro, late 2010) Power Management Issues relating to Nvidia-Current drivers

    - by gbvitaobscura
    I have tried many potential solutions but to no avail! (This is going to be a long post.) Essentially, when I first install Ubuntu (I have tested 10.11-12.04 beta) I can change the brightness of the MacBook backlight. One of the first things I usually do once my machine finishes installing is enter "sudo apt-get install nvidia-current" into the terminal to get my Nvidia graphics card set up. Before I install nvidia-current, power management works flawlessly. After installing nvidia-current my graphics driver works mostly properly (more on that later). When I press F1 or F2, notify-osd pops up and shows an icon for the backlight along with a meter which changes according to how much I press F1 or F2, but the backlight does not change brightness at all. Two supplementary facts: when I am running off of battery power the backlight does not ever dim, and changing the backlight brightness through the terminal does not work (it recognizes the command but changes nothing).

    Things which did not work:
    1. editing xorg.conf
    2. python script/hack
    3. editing grub

    Things which did work:
    1. removing nvidia-current

    Final piece of information: although my graphics card works in every way but power management, it cannot be found through System Settings/Details (Graphics Unknown). I'm using the NVIDIA GeForce 320M. The output of lspci is:

    00:00.0 Host bridge: NVIDIA Corporation MCP89 HOST Bridge (rev a1)
    00:00.1 RAM memory: NVIDIA Corporation MCP89 Memory Controller (rev a1)
    00:01.0 RAM memory: NVIDIA Corporation Device 0d6d (rev a1)
    00:01.1 RAM memory: NVIDIA Corporation Device 0d6e (rev a1)
    00:01.2 RAM memory: NVIDIA Corporation Device 0d6f (rev a1)
    00:01.3 RAM memory: NVIDIA Corporation Device 0d70 (rev a1)
    00:02.0 RAM memory: NVIDIA Corporation Device 0d71 (rev a1)
    00:02.1 RAM memory: NVIDIA Corporation Device 0d72 (rev a1)
    00:03.0 ISA bridge: NVIDIA Corporation MCP89 LPC Bridge (rev a2)
    00:03.1 RAM memory: NVIDIA Corporation MCP89 Memory Controller (rev a1)
    00:03.2 SMBus: NVIDIA Corporation MCP89 SMBus (rev a1)
    00:03.3 RAM memory: NVIDIA Corporation MCP89 Memory Controller (rev a1)
    00:03.4 Co-processor: NVIDIA Corporation MCP89 Co-Processor (rev a1)
    00:04.0 USB controller: NVIDIA Corporation MCP89 OHCI USB 1.1 Controller (rev a1)
    00:04.1 USB controller: NVIDIA Corporation MCP89 EHCI USB 2.0 Controller (rev a2)
    00:06.0 USB controller: NVIDIA Corporation MCP89 OHCI USB 1.1 Controller (rev a1)
    00:06.1 USB controller: NVIDIA Corporation MCP89 EHCI USB 2.0 Controller (rev a2)
    00:08.0 Audio device: NVIDIA Corporation MCP89 High Definition Audio (rev a2)
    00:09.0 Ethernet controller: NVIDIA Corporation MCP89 Ethernet (rev a1)
    00:0a.0 IDE interface: NVIDIA Corporation MCP89 SATA Controller (rev a2)
    00:0b.0 RAM memory: NVIDIA Corporation Device 0d75 (rev a1)
    00:15.0 PCI bridge: NVIDIA Corporation Device 0d9b (rev a1)
    00:17.0 PCI bridge: NVIDIA Corporation MCP89 PCI Express Bridge (rev a1)
    01:00.0 Network controller: Broadcom Corporation BCM43224 802.11a/b/g/n (rev 01)
    02:00.0 VGA compatible controller: NVIDIA Corporation Device 08a0 (rev a2)

    There is a very similar bug on Launchpad which I have posted below.
    https://bugs.launchpad.net/ubuntu/+source/nvidia-graphics-drivers/+bug/764195

    Read the article

  • Building Private IaaS with SPARC and Oracle Solaris

    - by ferhat
    A superior enterprise cloud infrastructure with high performing systems using built-in virtualization! We are happy to announce the expansion of Oracle Optimized Solution for Enterprise Cloud Infrastructure with Oracle's SPARC T-Series servers and Oracle Solaris.  Designed, tuned, tested and fully documented, the Oracle Optimized Solution for Enterprise Cloud Infrastructure now offers customers looking to upgrade, consolidate and virtualize their existing SPARC-based infrastructure a proven foundation for private cloud-based services which can lower TCO by up to 81 percent(1). Faster time to service, reduce deployment time from weeks to days, and can increase system utilization to 80 percent. The Oracle Optimized Solution for Enterprise Cloud Infrastructure can also be deployed at up to 50 percent lower cost over five years than comparable alternatives(2). The expanded solution announced today combines Oracle’s latest SPARC T-Series servers; Oracle Solaris 11, the first cloud OS; Oracle VM Server for SPARC, Oracle’s Sun ZFS Storage Appliance, and, Oracle Enterprise Manager Ops Center 12c, which manages all Oracle system technologies, streamlining cloud infrastructure management. Thank you to all who stopped by Oracle booth at the CloudExpo Conference in New York. We were also at Cloud Boot Camp: Building Private IaaS with Oracle Solaris and SPARC, discussing how this solution can maximize return on investment and help organizations manage costs for their existing infrastructures or for new enterprise cloud infrastructure design. Designed, tuned, and tested, Oracle Optimized Solution for Enterprise Cloud Infrastructure is a complete cloud infrastructure or any virtualized environment  using the proven documented best practices for deployment and optimization. The solution addresses each layer of the infrastructure stack using Oracle's powerful SPARC T-Series as well as x86 servers with storage, network, virtualization, and management configurations to provide a robust, flexible, and balanced foundation for your enterprise applications and databases.  For more information visit Oracle Optimized Solution for Enterprise Cloud Infrastructure. Solution Brief: Accelerating Enterprise Cloud Infrastructure Deployments White Paper: Reduce Complexity and Accelerate Enterprise Cloud Infrastructure Deployments Technical White Paper: Enterprise Cloud Infrastructure on SPARC (1) Comparison based on current SPARC server customers consolidating existing installations including Sun Fire E4900, Sun Fire V440 and SPARC Enterprise T5240 servers to latest generation SPARC T4 servers. Actual deployments and configurations will vary. (2) Comparison based on solution with SPARC T4-2 servers with Oracle Solaris and Oracle VM Server for SPARC versus HP ProLiant DL380 G7 with VMware and Red Hat Enterprise Linux and IBM Power 720 Express - Power 730 Express with IBM AIX Enterprise Edition and Power VM.

    Read the article

  • A Slice of Raspberry Pi

    - by Phil Factor
    Guest editorial for the ITPro/SysAdmin newsletter.

    The Raspberry Pi Foundation has done a superb design job on their new $35 network-enabled Linux computer. This tiny machine, incorporating an ARM processor on a Broadcom BCM2835 multimedia chip, aims to put the fun back into learning computing. The public response has been overwhelmingly positive.

    Note that aim: "…to put the fun back". Education in Information Technology is in dire straits. It always has been, but seems to have deteriorated further still, even in the face of improved provision of equipment.

    In many countries, the government controls the curriculum. It predicted a shortage in office-based IT skills, and so geared the ICT curriculum toward mind-numbing training in word-processing and spreadsheet skills. Instead, the shortage has turned out to be in people with an engineering mindset, who can solve problems with whatever technologies are available and learn new techniques quickly, in a rapidly changing field. In retrospect, the assumption that specific training was required rather than an education was an idiotic response to the arrival of mainstream information technology. As a result, ICT became a disaster area, which discouraged a generation of youngsters from a career in IT, and thereby led directly to the shortage of people with the skills that are required to exploit the potential of Information Technology.

    Raspberry Pi aims to reverse the trend. This is a rig that is geared to fast graphics in high resolution. It is no toy. It should be a superb games machine. However, the use of Fedora, Debian, or Arch Linux ARM shows the more serious educational intent behind the Foundation's work. It looks like it will even do some office work too!

    So, get hold of any power supply that provides a 5VDC source at the required 700mA; an old Blackberry charger will do or, alternatively, it will run off four AA cells. You'll need a USB hub to support the mouse and keyboard, and maybe a hard drive. You'll want a DVI monitor (with audio out) or TV (sound and video). You'll also need to be able to cope with wired Ethernet 10/100, if you want networking. With this lot assembled, stick the paraphernalia on the back of the HDTV with Blu Tack, get a nice keyboard, and you have a classy Linux-based home computer. The major cost is in the TV and the keyboard.

    If you're not already writing software for this platform, then maybe, at a time when some countries are talking of orders in the millions, you should consider it.

    Read the article

  • The theory of evolution applied to software

    - by Michel Grootjans
    I recently realized the many parallels you can draw between the theory of evolution and evolving software. Evolution is not the proverbial million monkeys typing on a million typewriters, where one of them comes up with the complete works of Shakespeare. We would have noticed by now, since the proverbial monkeys are now blogging on the Internet ;-) One of the main ideas of the theory of evolution is the balance between random mutations and natural selection. Random mutations happen all the time: millions of mutations over millions of years. Most of them are totally useless. Some of them are beneficial to the evolved species. Natural selection favors the beneficially mutated species. Less beneficial mutations die off. The mutated rabbit doesn't have to be faster than the fox. It just has to be faster than the other rabbits.   Theory of evolution Evolving software Random mutations happen all the time. Most of these mutations are so bad, the new species dies off, or cannot reproduce. Developers write new code all the time. New ideas come up during the act of writing software. The really bad ones don't get past the stage of idea. The bad ones don't get committed to source control. Natural selection favors the beneficial mutated species Good ideas and new code gets discussed in group during informal peer review. Less than good code gets refactored. Enhanced code makes it more readable, maintainable... A good set of traits makes the species superior to others. It becomes widespread A good design tends to make it easier to add new features, easier to understand the current implementations, easier to optimize for performance...thus superior. The best designs get carried over from project to project. They appear in blogs, articles and books about principles, patterns and practices.   Of course the act of writing software is deliberate. This can hardly be called random mutations. Though it sometimes might seem that code evolves through a will of its own ;-) Does this mean that evolving software (evolution) is better than a big design up front (creationism)? Not necessarily. It's a false idea to think that a project starts from scratch and everything evolves from there. Everyone carries his experience of what works and what doesn't. Up front design is necessary, but is best kept simple and minimal, just enough to get you started. Let the good experiences and ideas help to drive the process, whether they come from you or from others, from past experience or from the most junior developer on your team. Once again, balance is the keyword. Balance design up front with evolution on a daily basis. How do you know what balance is right? Through your own experience of what worked and what didn't (here's evolution again). Notes: The evolution of software can quickly degenerate without discipline. TDD is a discipline that leaves little to chance on that part. Write your test to describe the new behavior. Write just enough code to make it behave as specified. Refactor to evolve the code to a higher standard. The responsibility of good design rests continuously on each developers' shoulders. Promiscuous pair programming helps quickly spreading the design to the whole team.
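    The notes above describe the red-green-refactor rhythm in prose. As a rough illustration only (the original post contains no code; the slugify function and its tests are invented for this sketch, which uses Node's built-in assert module rather than any particular test framework), one turn of that cycle might look like this in TypeScript:

    import { strict as assert } from "node:assert";

    // "Write just enough code to make it behave as specified":
    // a minimal implementation driven by the two assertions below.
    function slugify(title: string): string {
      return title.trim().toLowerCase().replace(/\s+/g, "-");
    }

    // The tests describe the new behaviour and were written first; they failed
    // until the function above existed. They now act as the safety net that
    // lets the implementation be refactored "to a higher standard" later.
    assert.equal(slugify("Evolving Software"), "evolving-software");
    assert.equal(slugify("  Theory of Evolution  "), "theory-of-evolution");
    console.log("tests passed");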

    Read the article

  • Watch 53rd Grammy Awards 2011 Live Stream & Follow The Event On Facebook & Twitter

    - by Gopinath
    Grammy Awards is the biggest music awards ceremony that honours music genres. The 53rd Grammy Awards function will be held on February 13, 2011 at Staples Center in Los Angeles. The star studded musical event will be telecasted live on CBS TV between 8pm ET until 11:30pm ET. Behind The Scenes Live Stream On YouTube and Grammy.com Grammy, YouTube and TurboTax has teamed up to provide live stream of behind the scene coverage of key events starting at 5pm Feb 11th and runs through February 13th. Note that this stream will not broadcast the 3 hour long award presentation. Kind of bad, except the award presentation rest of the ceremony is streamed here.  The video stream is embedded below Catch live behind-the-scenes coverage of key events during GRAMMY week. Before the show, you can watch the Nominees Reception, Clive Davis Pre-GRAMMY Gala, the red carpet and the pre-telecast ceremony. During the show, go backstage to see the winners after they accept their award! You can catch the same stream on Grammys’s YouTube channel as well as on Grammy.com website Follow Grammy Awards On Facebook & Twitter The Grammy Awards has an official Facebook and Twitter account and you can follow them for all the latest information The Grammy’s Facebook Page The Grammy’s Twitter Page Grammy Live Streaming on UStream & Justin.tv Even though there is no official source that stream live of The Grammy Awards presentation ceremony, there will be plenty of unofficial sources on UStream and Justin.tv. So search UStream and Justin.tv for live streaming of The Grammy’s. TV Channels Telecasting The Grammy Awards 2011 Here is the list of TV channels in various countries offering live telecast of The Grammy Awards 2011. We keep updating this section as and when we get more information USA – CBS Television Network  – February 13, 2011 India – VH1 TV – February 14  2011 , 6:30 AM United Kingdom – ITV2  – 16 February 2011, 10:00PM – 12:00AM This article titled,Watch 53rd Grammy Awards 2011 Live Stream & Follow The Event On Facebook & Twitter, was originally published at Tech Dreams. Grab our rss feed or fan us on Facebook to get updates from us.

    Read the article

  • Professional Developers, may I join you?

    - by Ben
    I currently work in technical support for a software/hardware company and for the most part it's a good job, but it's feeling more and more like I'm getting 'stuck' here. No raises in the 5 years I've been here, and lately there seems to be more hiring from the outside than promotion from within. The work I do is more technical than end-user support, as we deal primarily with our field technicians who have a little more technical skill than the general user base. As a result I get into much more technical support issues... often tracking down bugs in our software, finding performance bottlenecks in our database schema, etc. The work I'm most proud of are the development projects I've come up with on my own, and worked on during lunch breaks and slow periods in Support. Over the years I've written a number of useful utilities for the company. Diagnostic type applications that several departments use and appreciate. These include apps that simulate our various hardware devices, log file analysis, time-saving utilities for our work processes, etc. My best projects have been the hardware simulation programs, which are the type of thing we probably wouldn't have put a full-time developer on had anyone thought to do it, but they've ended up being popular and useful enough to be used by development, QA, R&D, and Support. They allow us to interface our software with simulated hardware, rather than clutter up our work areas with bulky, hard to acquire equipment. Since starting here my life has moved forward (married, kid, one more on the way), but it feels like my career has not. I still earn what I earned walking in the door my first day. Company budget is tight, bonuses have gone down, and no raises or cost of living / inflation adjustments either. As the sole source of income for my family I feel I need to do more, and I'd like to have a more active role in creating something at work, not just cleaning up other people's mistakes. I enjoy technical work, and I think development is the next logical step in my career. I'd like to bring some "legitimacy" to my part-time development work, and make myself a more skilled and valuable employee. Ultimately if this can help me better support my family, that would be ideal. Can I make the jump to professional developer? I have an engineering degree, but no formal education in computer science. I write WinForms apps using the .NET framework, do some freelance web development, have volunteered to write software for a nonprofit, and have started experimenting with programming microcontrollers. I enjoy learning new things in the limited free time I have available. I think I have the aptitude to take on a development role, even in an 'apprentice' capacity if such an option is possible. Have any of you moved into development like this? Do any of you developers have any advice or cautionary tales? Are there better career options I haven't thought of? I welcome any and all related comments and thank you in advance for posting them.

    Read the article

  • Upgrading Code from 2007 to 2010

    - by MOSSLover
    So I’ve been doing some upgrades just to see if things will work from 2007 to 2010.  So far most of the stuff I want works, but obviously there are some things that break.  Did you guys know that in 2007 you could add a webpart to the view pages for lists and libraries without losing the toolbar?  In 2010 the ribbon disappears every time you add a webpart.  So if you are using Scot Hillier’s Codeplex project to hide buttons it will not work the same way, because the ribbon is going to disappear altogether. I have also learned another reason why standalone installations are the bane of my existence.  Nine times out of ten the installation is done using Network Service as the application pool account.  You are wondering why is this bad?  Well, let’s just say the site collection administrator with local admin rights wants to attach the IIS Worker process and debug say a webpart.  Visual Studio 2010 will throw a nasty error that tells you that you are not an administrator.  You will say, but I am an administrator?  I have all the correct group permissions on the server and on SQL and in SharePoint.  Then you will go in and decide let’s add my own admin account just to see if I can attach the debugger and you will notice that works properly.  So the morale of the story is create a separate account on your development environment to run all the SharePoint Services and such.  You don’t need to go all out and create the best practices amount of accounts if it’s just your dev environment.  I would at least create one single account to run all your SharePoint process (Services, SQL, and App Pool).  Also, don’t run a standalone install unless you want to kill kittens (this is a quote from Todd Klindt).  We love kittens they are cute and awesome.  Besides you learn more if you click Complete and just skip standalone.  You will learn how to setup SQL Server 2008 and you will learn how to configure your environment.  It will help you in the long run.  So I have ranted enough for today I figure these are enough tidbits for you this time around.  The two of you who read my blog and I know some of you are friends who don’t understand SharePoint.  I might as well have just done “wahwahwahwah” in Charlie Brown adult speak.  Thanks for reading as usual.  I’ll catch you all when I complain more about the upgrade process and share more tidbits, which will inevitably become a presentation at a conference or two. Technorati Tags: Upgrade Code SharePoint 2007 to 2010,Visuaul Studio 2010,SharePoint 2010

    Read the article

  • C#: A "Dumbed-Down" C++?

    - by James Michael Hare
    I was spending a lovely day this last weekend watching my sons play outside in one of the better weekends we've had here in Saint Louis for quite some time, and whilst watching them and making sure no limbs were broken or eyes poked out with sticks and other various potential injuries, I was perusing (in the correct sense of the word) this month's MSDN magazine to get a sense of the latest VS2010 features in both IDE and in languages. When I got to the back pages, I saw a wonderful article by David S. Platt entitled, "In Praise of Dumbing Down"  (msdn.microsoft.com/en-us/magazine/ee336129.aspx).  The title captivated me and I read it and found myself agreeing with it completely especially as it related to my first post on divorcing C++ as my favorite language. Unfortunately, as Mr. Platt mentions, the term dumbing-down has negative connotations, but is really and truly a good thing.  You are, in essence, taking something that is extremely complex and reducing it to something that is much easier to use and far less error prone.  Adding safeties to power tools and anti-kick mechanisms to chainsaws are in some sense "dumbing them down" to the common user -- but that also makes them safer and more accessible for the common user.  This was exactly my point with C++ and C#.  I did not mean to infer that C++ was not a useful or good language, but that in a very high percentage of cases, is too complex and error prone for the job at hand. Choosing the correct programming language for a job is a lot like choosing any other tool for a task.  For example: if I want to dig a French drain in my lawn, I can attempt to use a huge tractor-like backhoe and the job would be done far quicker than if I would dig it by hand.  I can't deny that the backhoe has the raw power and speed to perform.  But you also cannot deny that my chances of injury or chances of severing utility lines or other resources climb at an exponential rate inverse to the amount of training I may have on that machinery. Is C++ a powerful tool?  Oh yes, and it's great for those tasks where speed and performance are paramount.  But for most of us, it's the wrong tool.  And keep in mind, I say this even though I have 17 years of experience in using it and feel myself highly adept in utilizing its features both in the standard libraries, the STL, and in supplemental libraries such as BOOST.  Which, although greatly help with adding powerful features quickly, do very little to curb the relative dangers of the language. So, you may say, the fault is in the developer, that if the developer had some higher skills or if we only hired C++ experts this would not be an issue.  Now, I will concede there is some truth to this.  Obviously, the higher skilled C++ developers you hire the better the chance they will produce highly performant and error-free code.  However, what good is that to the average developer who cannot afford a full stable of C++ experts? That's my point with C#:  It's like a kinder, gentler C++.  It gives you nearly the same speed, and in many ways even more power than C++, and it gives you a much softer cushion for novices to fall against if they code less-than-optimally.  A bug is a bug, of course, in any language, but C# does a good job of hiding and taking on the task of handling almost all of the resource issues that make C++ so tricky.  For my money, C# is much more maintainable, more feature-rich, second only slightly in performance, faster to market, and -- last but not least -- safer and easier to use.  
That's why, where I work, I much prefer to see the developers moving to C#.  The quantity of bugs is much lower, and we don't need to hire "experts" to achieve the same results since the language itself handles those resource pitfalls so prevalent in poorly written C++ code.  C++ will still have its place in the world, and I'm sure I'll still use it now and again where it is truly the correct tool for the job, but for nearly every other project C# is a wonderfully "dumbed-down" version of C++ -- in the very best sense -- and to me, that's the smart choice.

    Read the article

  • Big Data – How to become a Data Scientist and Learn Data Science? – Day 19 of 21

    - by Pinal Dave
    In yesterday's blog post we learned the importance of analytics in the Big Data Story. In this article we will understand how to become a Data Scientist for the Big Data Story. Data Scientist is a new buzzword; everyone seems to want to become a Data Scientist. Let us go over a few key topics related to Data Scientists in this blog post. First of all we will understand what a Data Scientist is. In the new world of Big Data, I see pretty much everyone wants to become a Data Scientist, and there are lots of people I have already met who claim that they are Data Scientists. When I ask what their role is, I get a wide variety of answers.

    What is a Data Scientist? Data Scientists are the experts who understand various aspects of the business and know how to strategize data to achieve the business goals. They should have a solid foundation in various data algorithms, modeling and statistics methodology.

    What do Data Scientists do? Data Scientists understand the data very well. They go beyond the regular data algorithms and build interesting trends from available data. They innovate and resurrect entirely new meaning from the existing data. They are artists in the disguise of computer analysts. They look at the data traditionally as well as explore various new ways to look at the data. Data Scientists do not wait to build their solutions from existing data. They think creatively; they think before the data has entered the system. Data Scientists are visionary experts who understand the business needs and plan ahead of time; this tremendously helps to build solutions at rapid speed. Besides being data experts, the major quality of Data Scientists is "curiosity". They always wonder what more they can get from their existing data and how to get the maximum out of future incoming data. Data Scientists do wonders with the data, which goes beyond the job descriptions of Data Analysts or Business Analysts.

    Skills Required for Data Scientists. Here are a few of the skills a Data Scientist must have:
    - Expert level skills with statistical tools like SAS, Excel, R etc.
    - Understanding Mathematical Models
    - Hands-on with Visualization Tools like Tableau, PowerPivots, D3.js etc.
    - Analytical skills to understand business needs
    - Communication skills
    On the technology front any Data Scientist should know underlying technologies like Hadoop and Cloudera as well as their entire ecosystem (programming languages, analysis and visualization tools etc.). Remember that becoming a successful Data Scientist requires excellent skills; just having a degree in a relevant education field will not suffice.

    Final Note. Data Scientist is indeed a very exciting job profile. As per research there are not enough Data Scientists in the world to handle the current data explosion. In the near future data is going to expand exponentially, and the need for Data Scientists will increase along with it. It is indeed the job to focus on if you like data and the science of statistics. Courtesy: emc

    Tomorrow. In tomorrow's blog post we will discuss various Big Data learning resources. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • Dawn of the Enterprise Social Developer

    - by Mike Stiles
    Social is not just for poking friends, posting videos of cats playing pianos, or even just for brand marketing anymore. It has become a key form of communication internally and externally across every area of the enterprise. As a Java developer, are you positioning yourself for the integration of social into enterprise business systems that's on the near horizon? Because it's the work you do and the applications you build that will influence what the social-enabled enterprise is going to look like and how it's going to operate. But as a social developer, step one is wrapping your arms around all the things that are possible. Traditionally, the best exploration, brainstorming and innovation come from collaborating with other developers. That's how the big questions can be hashed (or hacked) out. Is Java the best social development environment? If not, what is? What's already being done in terms of application integration? The JavaOne Social Developer Program will offer up a series of talks and events on those very issues Tuesday, October 2 at the San Francisco Hilton. If you're interested in embarking on this newest frontier of enterprise social development, you can connect with others who are thinking the same thing and get moving on your first project.

    Talks will include:
    - Emergence Of The Social Enterprise
    - Extending Social into Enterprise Applications and Business Processes
    - Intro to Open Graph and Facebook's APIs
    - Building the Next Wave of Social Commerce Platforms
    - Social Data and the Enterprise
    - LinkedIn: A Professional Network Built with Java Technologies and Agile Practice
    - Social Developer Hackathon

    In addition to these learning and discussion opportunities, you might consider joining the new Oracle Social Developer Community (OSDC), where the interaction and collaboration can continue indefinitely. It doesn't take a lot of tea leaf reading to know that the cloud will house the enterprise technology of the future, and social (as well as the rich data it brings) is going to be a major part of that as social integrates across every business function, as there's proven value for consumer facing initiatives. The next phase of social development is going to involve combining enterprise data from multiple sources, new and existing, social and traditional, in order to tell compelling and usable stories. And social is coming to the enterprise quickly, meaning you as a development leader should seek to understand not just what's worked on the consumer side, but what aspects of those successes can be applied inside the organization. Get educated, get connected, and consider registering for this forward-looking event now to get started with enterprise social development.

    Read the article

  • The environment that is uniquely Oracle by Phillip Yi

    - by Nadiya
    In the past month, I have been given the exclusive opportunity to hire a Legal graduate/intern for Oracle's in-house Legal Counsel based here in North Ryde, Sydney. Whilst talking to various applicants, I am asked the same, broad question – what are we looking for? Time and time I have spoken about targeting the best, or targeting the best fit. I am an advocate of the latter, hence when approaching this question I answer very simply – 'we are looking for the individual that will fit into the culture and environment that is uniquely Oracle'. So, what is the environment/culture like here at Oracle? What makes Oracle so unique and a great place to work, especially as a graduate? Much like our business model, we are forward and innovative thinkers – we are not afraid to try new things, whether it is a success or failure. We are all highly driven, motivated and successful individuals – Oracle is a firm believer that in order to be driven, motivated and successful, you need to be surrounded by like minded people. And last, we are all autonomous and independent, self starters – at Oracle you are treated as an adult. We are not in the business of continually micro managing, nor constantly spoon feeding or holding your hand. Oracle has an amazing support, resource and training network – if you need support, extra training or resources it is there for your taking. And of course, if you do it on your own accord, you will learn it much quicker. For those reasons, Oracle is unique in its environment – we ensure and set up everyone for success. With such a great working environment/culture, why wouldn't you choose Oracle?

    Read the article

  • How should an object that uses composition set its composed components?

    - by Casey
    After struggling with various problems and reading up on component-based systems and reading Bob Nystrom's excellent book "Game Programming Patterns" and in particular the chapter on Components, I determined that this is a horrible idea:

    //Class intended to be inherited by all objects. Engine uses Objects exclusively.
    class Object : public IUpdatable, public IDrawable
    {
    public:
        Object();
        Object(const Object& other);
        Object& operator=(const Object& rhs);
        virtual ~Object() =0;

        virtual void SetBody(const RigidBodyDef& body);
        virtual const RigidBody* GetBody() const;
        virtual RigidBody* GetBody();

        //Inherited from IUpdatable
        virtual void Update(double deltaTime);

        //Inherited from IDrawable
        virtual void Draw(BITMAP* dest);

    protected:
    private:
    };

    I'm attempting to refactor it into a more manageable system. Mr. Nystrom uses the constructor to set the individual components; CHANGING these components at run-time is impossible. It's intended to be derived and be used in derivative classes or factory methods where their constructors do not change at run-time, i.e. his Bjorne object is just a call to a factory method with a specific call to the GameObject constructor. Is this a good idea? Should the object have a default constructor and setters to facilitate run-time changes, or no default constructor without setters and instead use a factory method?

    Given:

    class Object
    {
    public:
        //...See below for constructor implementation concerns.
        Object(const Object& other);
        Object& operator=(const Object& rhs);
        virtual ~Object() =0;

        //See below for Setter concerns
        IUpdatable* GetUpdater();
        IDrawable* GetRenderer();

    protected:
        IUpdatable* _updater;
        IDrawable* _renderer;

    private:
    };

    Should the components be read-only and passed in to the constructor via:

    class Object
    {
    public:
        //No default constructor.
        Object(IUpdatable* updater, IDrawable* renderer);
        //...remainder is same as above...
    };

    or should a default constructor be provided and then the components can be set at run-time?

    class Object
    {
    public:
        Object();
        //...
        SetUpdater(IUpdater* updater);
        SetRenderer(IDrawable* renderer);
        //...remainder is same as above...
    };

    or both?

    class Object
    {
    public:
        Object();
        Object(IUpdater* updater, IDrawable* renderer);
        //...
        SetUpdater(IUpdater* updater);
        SetRenderer(IDrawable* renderer);
        //...remainder is same as above...
    };

    Read the article

  • Modernizr Rocks HTML5

    - by Laila
    HTML5 is a moving target. At the moment, we don't know what will be in future versions. In most circumstances, this really matters to the developer. When you're using Adobe Air, you can be reasonably sure what works, what is there, and what isn't, since you have a version of the browser built-in. With Metro, you can assume that you're going to be using at least IE 10. If, however, you are using HTML5 in a web application, then you are going to rely heavily on Feature Detection. Feature Detection is a collection of techniques that tell you, via JavaScript, whether the current browser has a feature natively implemented or not.

    Feature Detection isn't just there for the esoteric stuff such as Geo-location, progress bars, <canvas> support, the new <input> types, Audio, Video, web workers or storage, but is required even for semantic markup, since old browsers make a pig's ear out of rendering it. Feature detection can't rely just on reading the browser version and inferring from that what works. Instead, you must use JavaScript to check that an HTML5 feature is there before using it. The problem with relying on the user-agent is that it takes a lot of historical data to work out what version does what, and, anyway, the user-agent can be, and sometimes is, spoofed.

    The open-source library Modernizr is just about the most essential JavaScript library for anyone using HTML5, because it provides APIs to test for most of the CSS3 and HTML5 features before you use them, and is intelligent enough to alter semantic markup into 'legacy' markup using shims on page-load for old browsers. It also allows you to check what video codecs are installed for playing video. It also provides media queries and conditional resource-loading (formerly YepNope.js). Generally, Modernizr gives you the choice of what you do about browsers that don't support the feature that you want. Often, the best choice is graceful degradation, but the resource-loading feature allows you to dynamically load JavaScript shims to replace the standard API for missing or defective HTML5 functionality, called 'PolyFills'. As the Modernizr site says, 'Yes, not only can you use HTML5 today, but you can use it in the past, too!'

    The evolutionary progress of HTML5 requires a more defensive style of JavaScript programming where the programmer adopts a mindset of fearing the worst (IE 6) rather than assuming the best, whilst exploiting as many of the new HTML features as possible for the requirements of the site or HTML application. Why would anyone want the distraction of developing their own techniques to do this when Modernizr exists to do this for you? Laila
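    As a rough sketch of the check-before-use pattern the article describes (not code from the article; it assumes a browser environment where the Modernizr script may or may not have been loaded, and refers to Modernizr's boolean canvas detect), feature detection looks something like this in TypeScript:

    // Declared only for this sketch: if Modernizr is loaded it exposes boolean
    // detects such as Modernizr.canvas; if not, the global is simply absent.
    declare const Modernizr: { canvas?: boolean } | undefined;

    function supportsCanvas(): boolean {
        // Prefer Modernizr's pre-computed result when the library is present...
        if (typeof Modernizr !== "undefined" && Modernizr.canvas !== undefined) {
            return Modernizr.canvas;
        }
        // ...otherwise fall back to a hand-rolled check: create the element and
        // see whether a 2D drawing context is actually available.
        const probe = document.createElement("canvas");
        return typeof probe.getContext === "function" && probe.getContext("2d") !== null;
    }

    if (supportsCanvas()) {
        // Safe to use <canvas> for drawing.
    } else {
        // Graceful degradation: fall back to a static image or plain markup,
        // or dynamically load a polyfill.
    }

    The same shape applies to any other detect (video, storage, web workers): test first, then either use the feature, degrade gracefully, or load a polyfill, which is exactly the choice the article says Modernizr leaves to you.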

    Read the article

  • Wireless Problem on HP Pavillion G6

    - by user47954
    I have a Broadcom wireless card in my laptop and the wireless is not working correctly. Right in front of the router the wireless signal is 70%, across the room it barely works, and in another room it disconnects. I have the drivers and everything. I am running Ubuntu 11.10 64-bit; it works perfectly in Windows 7. Can anyone help?

    00:00.0 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1705]
    00:01.0 VGA compatible controller [0300]: ATI Technologies Inc Device [1002:9649]
    00:01.1 Audio device [0403]: ATI Technologies Inc Device [1002:1714]
        Kernel driver in use: HDA Intel
        Kernel modules: snd-hda-intel
    00:04.0 PCI bridge [0604]: Advanced Micro Devices [AMD] Device [1022:1709]
        Kernel driver in use: pcieport
        Kernel modules: shpchp
    00:11.0 SATA controller [0106]: Advanced Micro Devices [AMD] Device [1022:7804]
        Kernel driver in use: ahci
        Kernel modules: ahci
    00:12.0 USB Controller [0c03]: Advanced Micro Devices [AMD] Device [1022:7807] (rev 11)
        Kernel driver in use: ohci_hcd
    00:12.2 USB Controller [0c03]: Advanced Micro Devices [AMD] Device [1022:7808] (rev 11)
        Kernel driver in use: ehci_hcd
    00:13.0 USB Controller [0c03]: Advanced Micro Devices [AMD] Device [1022:7807] (rev 11)
        Kernel driver in use: ohci_hcd
    00:13.2 USB Controller [0c03]: Advanced Micro Devices [AMD] Device [1022:7808] (rev 11)
        Kernel driver in use: ehci_hcd
    00:14.0 SMBus [0c05]: Advanced Micro Devices [AMD] Device [1022:780b] (rev 13)
        Kernel modules: i2c-piix4
    00:14.2 Audio device [0403]: Advanced Micro Devices [AMD] Device [1022:780d] (rev 01)
        Kernel driver in use: HDA Intel
        Kernel modules: snd-hda-intel
    00:14.3 ISA bridge [0601]: Advanced Micro Devices [AMD] Device [1022:780e] (rev 11)
    00:14.4 PCI bridge [0604]: Advanced Micro Devices [AMD] Device [1022:780f] (rev 40)
    00:15.0 PCI bridge [0604]: Advanced Micro Devices [AMD] Device [1022:43a0]
        Kernel driver in use: pcieport
        Kernel modules: shpchp
    00:15.1 PCI bridge [0604]: Advanced Micro Devices [AMD] Device [1022:43a1]
        Kernel driver in use: pcieport
        Kernel modules: shpchp
    00:15.2 PCI bridge [0604]: Advanced Micro Devices [AMD] Device [1022:43a2]
        Kernel driver in use: pcieport
        Kernel modules: shpchp
    00:18.0 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1700] (rev 43)
    00:18.1 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1701]
    00:18.2 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1702]
    00:18.3 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1703]
    00:18.4 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1704]
    00:18.5 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1718]
    00:18.6 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1716]
    00:18.7 Host bridge [0600]: Advanced Micro Devices [AMD] Device [1022:1719]
    01:00.0 Ethernet controller [0200]: Realtek Semiconductor Co., Ltd. RTL8101E/RTL8102E PCI Express Fast Ethernet controller [10ec:8136] (rev 05)
        Kernel driver in use: r8169
        Kernel modules: r8169
    07:00.0 Network controller [0280]: Broadcom Corporation Device [14e4:4727] (rev 01)
        Kernel driver in use: wl
        Kernel modules: wl
    08:00.0 Class [ff00]: Realtek Semiconductor Co., Ltd. Device [10ec:5209] (rev 01)

    Read the article

  • JCP Awards 10 Year Retrospective

    - by Heather VanCura
    As we celebrate 10 years of JCP Program Award recognition in 2012,  take a look back in the Retrospective article covering the history of the JCP awards.  Most recently, the JCP awards were  celebrated at JavaOne Latin America in Brazil, where SouJava was presented the JCP Member of the Year Award for 2012 (won jointly with the London Java Community) for their contributions and launch of the Global Adopt-a-JSR Program. This is also a good time to honor the JCP Award Nominees and Winners who have been designated as Star Spec Leads.  Spec Leads are key to the Java Community Process (JCP) program. Without them, none of the Java Specification Requests (JSRs) would have begun, much less completed and become implemented in shipping products.  Nominations for 2012 Start Spec Leads are now open until 31 December. The Star Spec Lead program recognizes Spec Leads who have repeatedly proven their merit by producing high quality specifications, establishing best practices, and mentoring others. The point of such honor is to endorse the good work that they do, showcase their methods for other Spec Leads to emulate, and motivate other JCP program members and participants to get involved in the JCP program. Ed Burns – A Star Spec Lead for 2009, Ed first got involved with the JCP program when he became co-Spec Lead of JSR 127, JavaServer Faces (JSF), a role he has continued through JSF 1.2 and now JSF 2.0, which is JSR 314. Linda DeMichiel – Linda thus involved in the JCP program from its very early days. She has been the Spec Lead on at least three JSRs and an EC member for another three. She holds a Ph.D. in Computer Science from Stanford University. Gavin King – Nominated as a JCP Outstanding Spec Lead for 2010, for his work with JSR 299. His endorsement said, “He was not only able to work through disputes and objections to the evolving programming model, but he resolved them into solutions that were more technically sound, and which gained support of its pundits.” Mike Milikich –  Nominated for his work on Java Micro Edition (ME) standards, implementations, tools, and Technology Compatibility Kits (TCKs), Mike was a 2009 Star Spec Lead for JSR 271, Mobile Information Device Profile 3. David Nuescheler – Serving as the CTO for Day Software, acquired by Adobe Systems, David has been a key player in the growth of the company’s global content management solution. In 2002, he became Spec Lead for JSR 170, Content Repository for Java Technology API, continuing for the subsequent version, JSR 283. Bill Shannon – A well-respected name in the Java community, Bill came to Oracle from Sun as a Distinguished Engineer and is still performing at full speed as Spec Lead for JSR 342, Java EE 7,  as an alternate EC member, and hands-on problem solver for the Java community as a whole. Jim Van Peursem – Jim holds a PhD in Computer Engineering. He was part of the Motorola team that worked with Sun labs on the Spotless VM that became the KVM. From within Motorola, Jim has been responsible for many aspects of Java technology deployment, from an independent Connected Limited Device Configuration (CLDC) and Mobile Information Device Profile (MIDP) implementations, to handset development, to working with the industry in defining many related standards. Participation in the JCP Program goes well beyond technical proficiency. 
The JCP Awards Program is an attempt to say “Thank You” to all of the JCP members, Expert Group Members, Spec Leads, and EC members who give their time to contribute to the evolution of Java technology.

    Read the article

  • Getting Started with NASM

    - by MarkPearl
    Today I got to play with NASM. This is an assembler and disassembler that can be used to write 16-bit, 32-bit & 64-bit programs. Let me say upfront that the last time I looked at assembly code at any depth was when I was studying Computer Science in Pietermaritzburg – ten years ago – and we never ever got to touch any real assembly code, so a lot of what I am looking at today is very new to me.

    The first thing I did was download the NASM compiler. This turned out to be a bit more complicated than I thought. Originally I went to http://www.nasm.us/ and downloaded the nasm-2.09.04.zip file which I thought had all I needed. No luck! It seemed to just have the uncompiled code, and from what I could tell I would need to recompile and build it – possibly in C++? Well, I wasn't going to waste my time with that, so a bit more searching and I found the Win32 folder (http://www.nasm.us/pub/nasm/releasebuilds/2.09.04/win32/) containing Nasm.exe, which I downloaded.

    Choosing an IDE
    So, I have the NASM compiler, but to compile anything you need to pass a string of special characters in the command prompt. That's fine if I was going to just do one program once every couple of years, but since I am aiming to do quite a bit more exploration of NASM I began searching for an IDE. There were a few options; even apparently Visual Studio with a bit of tweaking could do the job, but from past experience I wanted to avoid the VS route as it can sometimes get confusing. I eventually settled on TextPad, which I had used a few years ago for a similar project, and it had been simple enough yet powerful enough to do the job. A bit of searching and I found a syntax file for NASM and everything seemed hunky dory.

    Configuring TextPad to run the NASM Compiler
    Next was to get TextPad to run the NASM compiler. TextPad has an external tools option that allows one to configure special commands. To simplify the process I first created a bat file in the NASM directory that allowed me to simply compile asm files. The bat file was called as.bat and had just one line of code:

    nasm -f bin %1.asm -o %1.com -l %1.lst

    Once I had created as.bat I just needed to go into TextPad and create a tool. I have made a quick video of that just showing you where the various settings are, which is viewable below.

    The 64-bit Problem
    So I now have an 'IDE' linked to my NASM compiler, so everything should be fine, right? No! Whenever I try to compile an asm program it compiles fine, but when I try and run it I get an error – "This version of the file is not compatible with the version of Windows you're running. Check your computer's system information to see whether you need an x86 (32-bit) or x64 (64-bit) version of the program, and then contact the software publisher." Well, it turns out there are a few complications with having a 64-bit OS! So after searching Google and not coming to any real solution that I could find, other than perhaps attempting to build the code for NASM, I eventually resorted to running a VM with Windows XP on it and putting NASM there.

    My first hello world program
    So I attempted my first hello world program as per an example I found. The code was quite simple and is shown below.

    bits 16
    org 0x100
    jmp main

    message: db 'Hello World',0ah,0dh,'$'

    main:
        mov dx,message
        mov ah,09
        int 21h
        int 20h

    Running the build tool from TextPad, everything compiles fine and I now have a console app with hello world shown.

    Conclusion
    It's very early days with NASM.
I have been spoilt with Visual Studio and high order languages so I assume it will be a painful ride getting into the basics of assembly programming but I am hoping that at the end of it, I will at least have a bit more exposure to a language closer to the metal.

    Read the article

  • Sweet and Sour Source Control

    - by Tony Davis
    Most database developers don't use Source Control. A recent anonymous poll on SQL Server Central asked its readers "Which Version Control system do you currently use to store you database scripts?" The winner, with almost 30% of the vote was...none: "We don't use source control for database scripts". In second place with almost 28% of the vote was Microsoft's VSS. VSS? Given its reputation for being buggy, unstable and lacking most of the basic features required of a proper source control system, answering VSS is really just another way of saying "I don't use Source Control". At first glance, it's a surprising thought. You wonder how database developers can work in a team and find out what changed, when the system worked before but is now broken; to work out what happened to their changes that now seem to have vanished; to roll-back a mistake quickly so that the rest of the team have a functioning build; to find instantly whether a suspect change has been deployed to production. Unfortunately, the survey didn't ask about the scale of the database development, and correlate the two questions. If there is only one database developer within a schema, who has an automated approach to regular generation of build scripts, then the need for a formal source control system is questionable. After all, a database stores far more about its metadata than a traditional compiled application. However, what is meat for a small development is poison for a team-based development. Here, we need a form of Source Control that can reconcile simultaneous changes, store the history of changes, derive versions and builds and that can cope with forks and merges. The problem comes when one borrows a solution that was designed for conventional programming. A database is not thought of as a "file", but a vast, interdependent and intricate matrix of tables, indexes, constraints, triggers, enumerations, static data and so on, all subtly interconnected. It is an awkward fit. Subversion with its support for merges and forks, and the tolerance of different work practices, can be made to work well, if used carefully. It has a standards-based architecture that allows it to be used on all platforms such as Windows Mac, and Linux. In the words of Erland Sommerskog, developers should "just do it". What's in a database is akin to a "binary file", and the developer must work only from the file. You check out the file, edit it, and save it to disk to compile it. Dependencies are validated at this point and if you've broken anything (e.g. you renamed a column and broke all the objects that reference the column), you'll find out about it right away, and you'll be forced to fix it. Nevertheless, for many this is an alien way of working with SQL Server. Subversion is the powerhouse, not the GUI. It doesn't work seamlessly with your existing IDE, and that usually means SSMS. So the question then becomes more subtle. Would developers be less reluctant to use a fully-featured source (revision) control system for a team database development if they had a turn-key, reliable system that fitted in with their existing work-practices? I'd love to hear what you think. Cheers, Tony.

    Read the article

  • Ubuntu 12.10 Trackpoint not detected Thinkpad X130e

    - by killerknives
    I just did a fresh install of 12.10 and now only my touchpad works. The trackpoint and the Left/Right buttons below the trackpoint do not work. The trackpoint worked fine as of 12.04. I've searched online and there was a hack that said that disabling the touchpad would enable the trackpoint. WRONG! You'll end up having to use the keyboard =/. I don't know what is needed so I'll just dump some stuff.

    lspci:

    00:00.0 Host bridge: Advanced Micro Devices [AMD] Family 14h Processor Root Complex
    00:01.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Wrestler [Radeon HD 6310]
    00:01.1 Audio device: Advanced Micro Devices [AMD] nee ATI Wrestler HDMI Audio [Radeon HD 6250/6310]
    00:05.0 PCI bridge: Advanced Micro Devices [AMD] Family 14h Processor Root Port
    00:06.0 PCI bridge: Advanced Micro Devices [AMD] Family 14h Processor Root Port
    00:07.0 PCI bridge: Advanced Micro Devices [AMD] Family 14h Processor Root Port
    00:11.0 SATA controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 SATA Controller [AHCI mode]
    00:12.0 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
    00:12.2 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB EHCI Controller
    00:13.0 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB OHCI0 Controller
    00:13.2 USB controller: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 USB EHCI Controller
    00:14.0 SMBus: Advanced Micro Devices [AMD] nee ATI SBx00 SMBus Controller (rev 42)
    00:14.2 Audio device: Advanced Micro Devices [AMD] nee ATI SBx00 Azalia (Intel HDA) (rev 40)
    00:14.3 ISA bridge: Advanced Micro Devices [AMD] nee ATI SB7x0/SB8x0/SB9x0 LPC host controller (rev 40)
    00:14.4 PCI bridge: Advanced Micro Devices [AMD] nee ATI SBx00 PCI to PCI Bridge (rev 40)
    00:18.0 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 0 (rev 43)
    00:18.1 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 1
    00:18.2 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 2
    00:18.3 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 3
    00:18.4 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 4
    00:18.5 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 6
    00:18.6 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 5
    00:18.7 Host bridge: Advanced Micro Devices [AMD] Family 12h/14h Processor Function 7
    01:00.0 Network controller: Realtek Semiconductor Co., Ltd. RTL8188CE 802.11b/g/n WiFi Adapter (rev 01)
    02:00.0 Ethernet controller: Atheros Communications Inc. AR8151 v2.0 Gigabit Ethernet (rev c0)

    I've installed gpointing-device-settings and it only lists the touchpad. No trackpoint. What should I do? What data do you need?

    Read the article

  • User connection management in Reporting Services configuration

    - by Testas
    IT professionals will use Reporting Services Configuration Manager to perform post installation tasks for SQL Server Reporting Services. Introduced in SQL Server 2005, Reporting Services Configuration Manager provides an intuitive interface to perform tasks including specifying the report server database, report manager url, and indeed one of the first post installation tasks that should be performed is backing up the encryption keys that are used to protect the sensitive information within the rdl files.  Many of the options that are selected within Reporting Services Configuration Manager are written to a number of configuration files including the rsreportserver.config file located in C:\Program Files\Microsoft SQL Server\Report Server InstanceName\Reporting Services\ReportServer folder.When opening this file you will notice that there are more configuration settings within the rsreportserver.config file than is available through the Reporting Services Configuration Manager Interface. As a result there are additional configuration options that can be defined within this file.  A customer was having a problem performing stress tests against a new Report Server that would be going live for an enterprise reporting system. One aspect of the stress test was to fire 50 connections from a single user account. When performing the stress test an error described that the maximum active request had been exceeded. Within the rsreportserver.config, there is a key that is added to the file:  <Add Key=”MaxActiveReqForOneUser” Value=”20”/>  Changing the value from 20 to 50 accommodated the needs of the stress test, however, a wider question should be asked pertaining to this setting when implementing Reporting Services to a production environment. Within an intranet environment, the default setting is appropriate when network bandwidth is high, users are known and demand for reports is particularly high from a group of users.  However, when deploying a Reporting Server solution to an extranet, or the internet, you may want to consider reducing this setting to reduce to scope of connections that can be acquired by a single user and placing unnecessary pressure on the report server. I do hope that Reporting Services Configuration Manager evolves to include an advanced page that includes an intuitive interface to change configuration settings such as the MaxActiveReqForOneUser, and also configure rendering and data extensions and define secure connection levels to the report server. All these options can be configured within the rsreportserver.config file, and these are setting that customers would like to see in Reporting Services Configuration Manager in the future.   If you think that the SQL community would benefit from this addition, you can vote on it at Microsoft Connect  https://connect.microsoft.com/SQLServer/feedback/details/565575/extending-reporting-services-configuration-manager-rscm    

    Read the article

  • Using ext4 in VMware machine

    First of all, using a journaling filesystem like NTFS, ext4, XFS, or JFS (not to name all of them) is a very good idea, and nowadays it is unthinkable not to. Linux offers a good variety of options for the journaling filesystem on your system. For years I have been using SGI's XFS and I am pretty confident in the stability, performance and reliability of the system. In earlier years I had to struggle with incompatibilities between XFS and the boot loader; using an ext2-formatted /boot solved this issue. But, wow, that is ages ago!

    Lately, I had to set up a fresh Lucid Lynx (Ubuntu 10.04 LTS) system for a change of our internal groupware / messaging system. Therefore, I fired up a new virtual machine with an almost standard configuration in VMware Server and ran through our network-based PXE boot and installation procedure. At a certain step in this process, Ubuntu asks you about the partitioning of your hard drive(s). Honestly, I have to say that only out of curiosity I stuck to the "default" suggestion and put my faith and trust in the Ubuntu installation routine... resulting in an ext4-based root mount point ( / ). The rest of the installation went on without further concerns or worries.

    Note: I really can't remember why I chose to go away from my favourite... Well, it turned out to be the wrong decision after all.

    Ok, let's continue the story about ext4 in a VMware-based virtual machine. After some hours of installing additional packages and configuring the new system to use LDAP for general authentication and login, I had an "out-of-the-box" usable enterprise messaging system based on Zarafa 6.40 Community Edition, including a proper SSL-based Webaccess interface and the Z-Push extension for ActiveSync with my Nokia mobile. Straightforward and pretty nice for the time spent on the setup. Having priority on other tasks, I left the system running and didn't pay any further attention to it. Until I ran into an upgrade of "Mail for Exchange" on Symbian OS. My mobile did not bother me at all with the upgrade and everything went smoothly, but trying to re-establish the ActiveSync connection to the Zarafa messaging system resulted in a frustrating situation.

    So, I shifted my focus back to the Linux system and was amazed to find that the root filesystem had been remounted read-only due to hard drive failures, or at least errors reported by ext4. Firing up Google only confirmed my concerns, and it seems to me that ext4 for VMware-based virtual machines is not a stable and reliable candidate. You might consider reading these external resources:

    ext4 fs corruption under VMWare Server 2.01
    Bug #389555 - ext4 filesystem corruption

    Well, I learned my lesson, and ext{2|3|4}-based filesystems are not going to be used on any of my Linux systems or customer installations in the future.

    Addendum: I did not try this setup in other virtualization environments like VirtualBox, qemu, kvm, Xen, etc.
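    For anyone hitting the same symptom, here is a minimal sketch of the checks I would run to confirm it. The device name /dev/sda1 is an assumption; replace it with your actual root partition:

        # Is the root filesystem currently mounted read-only?
        mount | grep ' on / '

        # Look for ext4 errors in the kernel log
        dmesg | grep -i -e ext4 -e 'I/O error'

        # Inspect the recorded filesystem state and error count (safe on a mounted fs)
        sudo tune2fs -l /dev/sda1 | grep -i -e state -e error

        # A full check/repair must run on an unmounted filesystem,
        # e.g. from a rescue or live system:
        # fsck.ext4 -f /dev/sda1

    If the error count keeps climbing after a repair, that supports the conclusion that the combination of ext4 and this VMware setup is at fault rather than a one-off glitch.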

    Read the article

  • Windows Phone 7 Silverlight / XNA development talk

    - by subodhnpushpak
    Hi, I presented on Windows Phone 7 app development using Silverlight. Here are a few pics from the event (Windows Phone 7 development).

    I demonstrated Visual Studio and the emulator capabilities/features, a demo of a WP7 app communicating with an OData service, along with a demo of an XNA app. There were lots of curious questions; I am listing them here because they keep popping up again and again:

    1. What tools does it take to develop a WP7 app? Are they free?
    A typical WP7 app can be developed using either Silverlight or XNA. For developers, Visual Studio 2010 is a good choice as it provides an integrated development environment with lots of useful project templates, which makes the task really easy. For designers, Blend may be used to develop the UI in XAML. Both tools are FREE (Express versions) to download and very intuitive to use.

    2. What about the learning curve?
    If you know C# (or any other programming language), the learning curve is really flat. XAML (used for the UI) may be new to you, but trust me, it's very intuitive. Also, you can use Microsoft Blend to generate the UI (XAML) for you.

    3. How can I develop/test an app without using an actual device? How can I be sure my app runs as expected on an actual device?
    The WP7 SDK comes with an excellent emulator, which you can use for development and testing on a computer. Later you can just change a setting and deploy the application to a WP7 device. You will need the Zune software for deploying the application to the phone, along with a developer key from the WP7 marketplace. You can obtain the key from the marketplace by filling in a form. The whole registration process is easy; just follow the steps on the site.

    4. Which one should I use, Silverlight or XNA?
    Use Silverlight for enterprise/business/utility apps. Use XNA for games. While each platform is capable and strong, and they may be used in conjunction as well, the methodologies used for development on these platforms are very different. XNA works on a typical do...while game loop, whereas Silverlight works on an event-based methodology (see the short sketch below).

    5. Where are the learning resources? Are they free?
    There is lots of material on WP7, most of it free. There is an excellent free book by Charles Petzold to download, and http://www.microsoft.com/windowsphone is full of demos / to-dos / videos.

    All the exciting stuff was captured live and you can view it here, in case you were not able to catch it live!! @ http://livestre.am/AUfx. My talk starts at the 3:19:00 mark in the video!!

    Is there an app you miss on WP7? Do let me know about it and I may work on it for free!!!

    Keep discovering. Keep it Simple. WP7.

    Subodh
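    A minimal, hypothetical sketch of the difference mentioned in question 4 (not code from the talk): XNA drives your code continuously through a game loop, while Silverlight code runs only when an event fires. The class and control names are made up for illustration.

        // XNA: the framework calls Update/Draw every frame in a continuous game loop.
        using Microsoft.Xna.Framework;
        using Microsoft.Xna.Framework.Graphics;

        public class SampleGame : Game
        {
            GraphicsDeviceManager graphics;

            public SampleGame()
            {
                graphics = new GraphicsDeviceManager(this);
            }

            protected override void Update(GameTime gameTime)
            {
                // Called every frame: poll input, advance the game state.
                base.Update(gameTime);
            }

            protected override void Draw(GameTime gameTime)
            {
                // Called every frame: render the current state.
                GraphicsDevice.Clear(Color.CornflowerBlue);
                base.Draw(gameTime);
            }
        }

        // Silverlight: nothing runs until the user (or the system) raises an event.
        // Assumes a XAML page defining a Button named FetchButton and a TextBlock named Status.
        public partial class MainPage : Microsoft.Phone.Controls.PhoneApplicationPage
        {
            public MainPage()
            {
                InitializeComponent();
                FetchButton.Click += (sender, args) =>
                {
                    Status.Text = "Button clicked"; // runs only in response to the Click event
                };
            }
        }

    In practice this is why a Silverlight app idles at near-zero CPU between interactions, while an XNA app keeps ticking at its frame rate.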

    Read the article

  • Happy New Year! Upcoming Events in January 2011

    - by mandy.ho
    Oracle Database kicks off the New Year at the following events during the month of January. Hope to see you there and please send in your pictures and feedback!

    Jan 20, 2011 - San Francisco, CA
    LinkShare Symposium West 2011
    Oracle is a proud Gold Sponsor at the LinkShare Symposium West 2011, January 20 in San Francisco, California. Year after year LinkShare has been bringing their network the opportunity to come to life. At the LinkShare Symposium, online performance marketing leaders meet to optimize face-to-face during a full day of networking. Learn more by attending the Oracle breakout session, "Omni-Channel Retailing, What is possible now?" on Thursday, January 20, 11:15 a.m. - 12:00 noon, Grand Ballroom. http://eventreg.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=128306&src=6954634&src=6954634&Act=397

    Jan 24, 2011 - Cincinnati, OH
    Greater Cincinnati Oracle User Group Meeting
    "Tom Kyte Day" - featuring a day of sessions presented by Senior Technical Architect Tom Kyte. Sessions include "Top 10, no 11, new features of Oracle Database 11g Release 2" and "What do I really need to know when upgrading", plus more. http://www.gcoug.org/

    Jan 25, 2011 - Vancouver, British Columbia
    Oracle Security Solutions Forum
    Featuring a special keynote presentation from Tom Kyte - Complete Database Security. Join us at this half-day event, Oracle Database Security Solutions: Complete Information Security. Learn how Oracle Database Security solutions help you:
    • Prevent external threats like SQL injection attacks from reaching your databases
    • Transparently encrypt application data without application changes
    • Prevent privileged database users and administrators from accessing data
    • Use native database auditing to monitor and report on database activity
    • Mask production data for safe use in nonproduction environments
    http://eventreg.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=126974&src=6958351&src=6958351&Act=97

    Jan 26, 2011 - Halifax, Nova Scotia
    Oracle Database Security Technology Day
    Exclusive seminar on Complete Information Security with Oracle Database 11g. The amount of digital data within organizations is growing at unprecedented rates, as is the value of that data and the challenge of safeguarding it. Yet most IT security programs fail to address database security--specifically, insecure applications and privileged users. So how can you protect your mission-critical information? Avoid risky third-party solutions? Defend against security breaches and compliance violations? And resist costly new infrastructure investments? Join us at this half-day seminar, Oracle Database Security Solutions: Complete Information Security, to find out how. http://eventreg.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=126269&src=6958351&src=6958351&Act=93

    Read the article

  • ArchBeat Link-o-Rama for 11/17/2011

    - by Bob Rhubart
    Building an Infrastructure Cloud with Oracle VM for x86 + Enterprise Manager 12c | Richard Rotter
    Richard Rotter demonstrates "how easy it could be to build a cloud infrastructure with Oracle's solution for cloud computing."

    Article: Social + Lean = Agile | Dave Duggal
    In today's increasingly dynamic business environment, organizations must continuously adapt to survive. Change management has become a major bottleneck. Organizations need a practical mechanism for managing controlled variance and change in-flight to break the logjam. This paper provides a foundation for applying lean and agile principles to achieve Enterprise Agility through social collaboration.

    Stress Testing Java EE 6 Applications - Free Article in Java Magazine | Adam Bien
    "It is strange," says Adam Bien. "Everyone is obsessed about green bars and code coverage, but testing of multi-threaded behavior is widely ignored - until the applications run into massive problems."

    Using Access Manager to Secure Applications Deployed on WebLogic | Rene van Wijk
    Another great how-to post from Oracle ACE Rene van Wijk, this time involving JBoss RichFaces, Facelets, Oracle Coherence, and Oracle WebLogic Server.

    DOAG 2011 vs. Devoxx - Value and Attraction | Markus Eisele
    Oracle ACE Director Markus Eisele compares and contrasts these popular conferences with the aim of helping others decide which to attend.

    SOA All the Time; Architects in AZ; Clearing Info Integration Hurdles
    This week on the Architect Home Page on OTN.

    Webcast: Oracle Business Intelligence Mobile
    Event date: Wednesday, December 7, 2011. Time: 10 a.m. PT / 1 p.m. ET. Featuring Manan Goel (Director BI Product Marketing, Oracle) and Shailesh Shedge (Director BI and Analytics Practice, Ascentt).

    Webcast: Maximum Availability on Private Clouds
    A discussion of Oracle's Maximum Availability Architecture, Oracle Database 11g, Oracle Exadata Database Machine, and Oracle Database Appliance, featuring Margaret Hamburger (Director, Product Marketing, Oracle) and Joe Meeks (Director, Product Management, Oracle). November 30, 2011 at 10:00am PT / 1:00pm ET.

    Oracle Technology Network Architect Day - Phoenix, AZ
    Wednesday, December 14, 2011, 8:30am - 5:00pm. The Ritz-Carlton, Phoenix, 2401 East Camelback Road, Phoenix, AZ 85016. Registration is free, but seating is limited.

    Read the article

  • What's Old is New Again

    - by David Dorf
    Last night I told my son he could stream music to his tablet "from the cloud" (in this case, the Amazon Cloud). He paused, then said, "What is the cloud?" I replied, "a bunch of servers connected to the internet." Apparently he had visions of something much more magnificent. Another similar term is "big data." These marketing terms help to quickly convey topics but are oversimplifications that are open to many interpretations. At their core, those terms are shiny packages holding recycled ideas.

    I see many headlines declaring that big data changes everything, but it doesn't. Savvy retailers have been dealing with large volumes of data since the electronic cash register was invented. But there have been a few changes to the landscape that make big data a topic of conversation:

    1. Computing power has caught up to storage volumes. It's now possible to more thoroughly analyze the copious volumes of data retailers have been squirreling away. CPUs are faster, solid state drives are more plentiful, and new ways to store and search data are available. My iPhone has more power than the computer used in the Apollo mission to the moon.

    2. Unstructured data is everywhere. The Web used to be where retailers published product information, but now users generate the bulk of the content in the form of comments, videos, and "likes." The variety of information available to retailers is huge, and its meaning difficult to discern.

    3. Everything is connected. Looking at a report from my router, there are no fewer than 20 active devices on my home network. We can track the location of mobile phones, tag products with RFID, and set our thermostats (I love my Nest) from a thousand miles away. Not only is there more data, but it's arriving at higher velocity.

    Careful readers will note the three Vs that help define so-called big data: volume, variety, and velocity. We now have more volume, more variety, and more velocity, and different technologies to deal with them. But at the heart, the objectives are still the same:

    Informed decisions
    Accurate forecasts
    Improved optimizations

    So don't let the term "big data" throw you off the scent. Retailers still need to execute on the basics. But do take a fresh look at the data that's available and the new technologies to process it. The landscape will continue to change, and agile organizations will always be reevaluating their approaches. You can just add some more weapons to the arsenal.

    Read the article

  • From DBA to Data Analyst

    - by Denise McInerney
    Cross posted from the PASS Blog

    There is a lot changing in the data professional's world these days. More data is being produced and stored. More enterprises are trying to use that data to improve their products and services and understand their customers better. More data platforms and tools seem to be crowding the market. For a traditional DBA this can be a confusing and perhaps unsettling time. It's also a time that offers great opportunity for career growth. I speak from personal experience.

    We sometimes refer to the "accidental DBA", the person who finds herself suddenly responsible for managing the database because she has some other technical skills. While it was not accidental, six months ago I was unexpectedly offered a chance to transition out of my DBA role and become a data analyst. I have since come to view this offer as a gift, though at the time I wasn't quite sure what to do with it.

    Throughout my DBA career I've gotten support from my PASS friends and colleagues and they were the first ones I turned to for counsel about this new situation. Everyone was encouraging and I received two pieces of valuable advice: first, leverage what I already know about data and second, work to understand the business' needs. Bringing the power of data to bear to solve business problems is really the heart of the job. The challenge is figuring out how to do that.

    PASS had been the source of much of my technical training as a DBA, so I naturally started there to begin my Business Intelligence education. Once again the Virtual Chapter webinars, local chapter meetings and SQL Saturdays have been invaluable. I work in a large company where we are fortunate to have some very talented data scientists and analysts. These colleagues have been generous with their time and advice. I also took a statistics class through Coursera where I got a refresher in statistics and an introduction to the R programming language. And that's not the end of the free resources available to someone wanting to acquire new skills. There are many knowledgeable Business Intelligence and Analytics professionals who teach through their blogs. Every day I can learn something new from one of these experts.

    Sometimes we plan our next career move and sometimes it just happens. Either way a database professional who follows industry developments and acquires new skills will be better prepared when change comes. Take the opportunity to learn something about the changing data landscape and attend a Business Intelligence, Business Analytics or Big Data Virtual Chapter meeting. And if you are moving into this new world of data consider attending the PASS Business Analytics Conference in April where you can meet and learn from those who are already on that road.

    It's been said that "the only thing constant is change." That's never been more true for the data professional than it is today. But if you are someone who loves data and grasps its potential you are in the right place at the right time.

    Read the article

< Previous Page | 1058 1059 1060 1061 1062 1063 1064 1065 1066 1067 1068 1069  | Next Page >