Search Results

Search found 6697 results on 268 pages for 'learning'.

Page 155/268 | < Previous Page | 151 152 153 154 155 156 157 158 159 160 161 162  | Next Page >

  • Oracle’s Java Community Outreach Plan

    - by Yolande Poirier
    As the steward of Java, Oracle recognizes the importance and value of the Java community, and the relevant role it plays in keeping Java the largest, most vibrant developer community in the world. In order to increase Oracle's engagement with Java developers worldwide, we are shifting our focus from a flagship JavaOne event followed by several regional JavaOne conferences to a new outreach model which continues with the JavaOne flagship event, as well as a mix of online content, regional Java Tours, and regional 3rd party event participation.

    1. JavaOne. JavaOne remains the premier hub for Java developers, where you are given the opportunity to improve your Java technical skills and interact with other members of the Java community. JavaOne is centered on open collaboration and sharing, and Oracle will continue to invest in JavaOne as a unique stand-alone event for the Java community. Oracle recognizes that many developers cannot attend JavaOne in person, so Oracle will share the wealth of the unique event material with those developers through a new and easy-to-access online Java program. While online JavaOne content cannot replace actual face-to-face community/developer engagement and networking, online content does help extend the Java technical learning opportunity to a broader collection of developers.

    2. Java Developer Day Tours. Oracle will run regional Java Developer Days with recognized Java User Groups (JUGs), with participation from Java Evangelists and Java Champions. This allows local, region-specific Java topics to be addressed both by Oracle and the Java community. In addition, Oracle will deliver more virtual technical content programs to reach developers where an existing JUG may not have a presence.

    3. Sponsorship of Community-Driven Regional Events/Conferences. Oracle also recognizes that improved community dialog and relations are achievable through continued Oracle sponsorship and onsite participation at both established, well-recognized 3rd party events and new, emerging 3rd party events.

    Oracle's ultimate goal is to be an even better steward for Java by reaching more of the Java ecosystem with face-to-face and online community engagements. We look forward to planning tours and events with you, members of the Java community.

    Read the article

  • Teaching high school kids ASP.NET programming

    - by dotneteer
    During the 2011 Microsoft MVP Global Summit, I have been talking to people about teaching kids ASP.NET programming. I want to work with volunteer organizations to provide kids volunteer opportunities while they learn technical skills that can be applied elsewhere. The goal is to teach motivated kids enough skill to be productive with no more than 6 hours of instruction. Based on my prior experience teaching college extension courses and my involvement with high school math and science competitions, I think this is quite doable with classic ASP but a challenge with ASP.NET. I don't want to use ASP because it does not provide a good path into the future. After some consideration, I think this is possible with ASP.NET, and here are my thoughts:

    · Create a framework within ASP.NET for kids' programming.
    · Use an existing editor. No extra compiler or IntelliSense work needed.
    · Use a subset of C# like a scripting language. Teach data types, expressions, statements, if/for/while/switch blocks and functions. Use existing classes, but no class creation and no OOP.
    · Linear rendering model. No complicated life cycle.
    · Bare-metal HTML with some MVC-style helpers for widget creation; ASP.NET controls are optional. I want to teach kids to understand something and avoid black boxes as much as possible.
    · Use SQL for CRUD with a helper class. Again, I want to teach understanding rather than black boxes.
    · Provide a template to encourage clean separation of concerns.
    · Provide a conversion utility to convert code that uses the template to ASP.NET MVC. This will allow kids with AP Computer Science knowledge to step up to ASP.NET MVC.

    Let me know if you have thoughts or can help.

    Read the article

  • printable PHP manual - 'all but the Function Reference section'

    - by JW01
    My Motivation: I find it easier to learn things by reading 'offline'. I'd like to lean back and read the narrative part of a paper version of the official PHP manual.

    My Scuppered Plan: My plan was to download the manual, print all but the Function Reference section and then read it. I have downloaded the "Single HTML file" version of the manual from the php.net download page. (That version did not contain any images, so I patched in the ones from the "Many HTML files" version with no problem.) My plan was to open that "Single HTML file" in an HTML editor, delete the Function Reference section, then print it out. Unfortunately, although I have tried three different editors, I have not been able to successfully load up that massive HTML file to edit it. It's about 40MB. I started to look into the phpdoc framework with a view to rendering my own HTML docs from the source... but that's a steep learning curve for a newbie and is a last resort. I would use a file splitter, but they tend to split files crudely with no regard for HTML/XML/XHTML semantics.

    So the question is... Does anyone know where you can download the PHP manual in a version that is a kind of half-way house between the 'Single HTML file' and the 'Many HTML files'? Ideally with the docs split into 3 parts:

    File 1 - stuff before the function reference
    File 2 - function reference
    File 3 - stuff after the function reference

    Or can you suggest any editors/tools that will enable me to split up this file myself?
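    One way to do the split yourself, without ever loading the whole file into an editor, is to cut it programmatically at the markup that opens and closes the Function Reference section, so the pieces stay aligned with the manual's structure. Below is a minimal Python sketch of that idea; the marker strings and the filename are hypothetical placeholders that would need to be checked against the actual headings in the single-file manual.

        # Sketch: split the single-file PHP manual into three pieces around the
        # Function Reference section. START_MARKER and END_MARKER are hypothetical;
        # inspect the file and substitute the real headings that begin the Function
        # Reference and the first section after it.
        START_MARKER = '<h1 class="title">Function Reference</h1>'   # hypothetical
        END_MARKER = '<h1 class="title">FAQ</h1>'                    # hypothetical

        def split_manual(path):
            with open(path, encoding="utf-8", errors="replace") as f:
                html = f.read()                      # ~40MB fits comfortably in memory

            start = html.index(START_MARKER)         # raises ValueError if the marker is missing
            end = html.index(END_MARKER, start)

            parts = {
                "manual_1_before_reference.html": html[:start],
                "manual_2_function_reference.html": html[start:end],
                "manual_3_after_reference.html": html[end:],
            }
            for name, text in parts.items():
                with open(name, "w", encoding="utf-8") as out:
                    out.write(text)                  # fragments, but browsers render them well enough for printing

        split_manual("php_manual_en.html")           # hypothetical filename

    Parts 1 and 3 would then be small enough for any editor, and part 2 can simply be skipped when printing.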

    Read the article

  • How should I start with Lisp?

    - by Gary Rowe
    I've been programming for years now, working my way through various iterations of Blub (BASIC, Assembler, C, C++, Visual Basic, Java, Ruby in no particular order of "Blub-ness") and I'd like to learn Lisp. However, I have a lot of inertia, what with limited time (family, full-time job etc.) and a comfortable happiness with my current Blub (Java). So my question is this: given that I'm someone who would really like to learn Lisp, what would be the initial steps to get a good result that demonstrates the superiority of Lisp in web development? Maybe I'm missing the point, but that's how I would initially see the application of my Lisp knowledge. I'm thinking "use dialect A, use IDE B, follow instructions on page C, question your sanity after monads using counsellor D". I'd just like to know what people here consider to be an optimal set of values for A, B, C and perhaps D. Also, some discussion on the relative merit of learning such a powerful language as opposed to, say, becoming a Rails expert would be welcome. Just to add some more detail, I'll be developing on MacOS (or a Linux VM) - no Windows-based approaches will be necessary, thanks.

    Notes for those just browsing by: I'm going to keep this question open for a while so that I can offer feedback on the suggestions after I've been able to explore them. If you happen to be browsing by and feel you have something to add, please do. I would really welcome your feedback.

    Interesting links (assuming you're coming at Lisp from a Java background, this set of links will get you started quickly):

    Using IntelliJ's La Clojure plugin to integrate Lisp (videocast)
    Lisp for the Web
    Online version of Practical Common Lisp (c/o Frank Shearar)
    Land of Lisp - a (+ (+ very quirky) game based) way in, but makes it all so straightforward

    Read the article

  • How is IntelliJ better than Eclipse?

    - by NickC
    I know there have been questions like "What is your favorite editor/IDE?", but none of them have answered this question: why spend the money on IntelliJ when Eclipse is free? I'm personally a big IntelliJ fan, but I haven't really tried Eclipse. I've used IntelliJ for projects that were Java, JSP, HTML/CSS, JavaScript, PHP, and ActionScript, and the latest version, 9, has been excellent for all of them. Many coworkers in the past have told me that they believe Eclipse to be "pretty much the same" as IntelliJ, but, to counter that point, I've occasionally sat behind a developer using Eclipse who seemed comparatively inefficient at accomplishing roughly the same task, and I haven't experienced this with IntelliJ. The two may be on par feature-by-feature, but features can be ruined by a poor user experience, and I wonder if it's possible that IntelliJ is easier to pick up and makes its time-saving features easier to discover. For users who are already familiar with Eclipse, on top of the real cost of IntelliJ, there is also the cost of time spent learning the new app. Eclipse gets a lot of users who simply don't want to spend $250 on an IDE. If IntelliJ really could help my team be more productive, how could I sell it to them? For those users who've tried both, I'd be very interested in specific pros or cons either way.

    Read the article

  • How to review code that you do not understand?

    - by John Isaacks
    I have been given the role of improving development in our company. The first thing I wanted to start was code reviews, since that has never been done here before. There are 3 programmers in our company. I am a web programmer; my known languages are mainly PHP, ActionScript and JavaScript. The other 2 developers write internal applications in VB.NET. We have been doing code reviews for a couple of weeks now. I find it hard to understand VB code, so when they say what it's doing, for the most part I just have to take their word for it. If I do see something that looks wrong, I explain my opinion and how I would address it in one of the languages I know. Sometimes my suggestions are welcomed, but many times I am told things like "this is the best way of doing it in this language" or "that doesn't apply to this language" or similar things of that nature. This may be true, but without knowing the language I am not sure how to confirm or refute these claims. I know one possible solution would be to learn VB so I can do better code reviews. I really have no interest in learning VB (especially since I have a list of other technologies I am trying to learn for my own projects) and would like to keep this as a last resort, but it is an option. Another idea that came to me is that they both have an interest in C#, and so do I. It's relevant to them because it's .NET, and relevant to me because it's more similar to the languages I know. Yet it is new to all of us. I thought about the benefits of us all collaborating on a pet C#.NET project and reviewing each other's code from that. I guess there's also the possibility of hiring a consultant to come in and give us some code reviews. What would you recommend I do in this situation?

    Read the article

  • Gamification: Oracle Well and Truly Engaged

    - by ultan o'broin
    Here is a quick roundup of Oracle gamification events and activities. But first, some admissions of a misspent youth from Oracle vice presidents Jeremy Ashley, Nigel King, Mike Rulf, Dave Stephens, and Clive Swan (the video was used as an introduction to the Oracle Applications User Experience Gamification Design Jam). Other videos from that day are available, including the event teaser A History of Games; videos about UX and gamification are here, and here. On to the specifics:

    Marta Rauch's (@martarauch) presentations Tapping Enterprise Communities Through Gamification at STC 2012 and Gamification is Here: Build a Winning Plan at LavaCon 2012.
    Erika Webb's (@erikanollwebb) presentation Enterprise User Experience: Making Work Engaging at Oracle at the G-Summit 2012.
    Kevin Roebuck's blog outlining his team's gamification engagements, including the G-Summit, Innovations in Online Learning, and the America's Cup for Java Kids Virtual Design Competition at the Immersive Education Summit. Kevin also attended the UX Design Jam.
    Jake Kuramoto (@jkuramot) of Oracle AppsLab's (@theappslab) thoughts on the Gamification Design Jam. Jake and Co have championed gamification in the apps space for a while now.

    If you know of more Oracle gamification events or articles of interest, let us know in the comments.

    Read the article

  • PHP Aspect Oriented Design

    - by Devin Dixon
    This is a continuation of this Code Review question. The takeaway from that post, and from aspect-oriented design in general, is that it is hard to debug. To counter that, I implemented the ability to turn tracing of the design patterns on. Turning trace on works like this:

        //This can be added anywhere in the code
        Run::setAdapterTrace(true);
        Run::setFilterTrace(true);
        Run::setObserverTrace(true);

        //Execute the function
        echo Run::goForARun(8);

    In the actual log with the trace turned on, it outputs like so:

        adapter 2012-02-12 21:46:19 {"type":"closure","object":"static","call_class":"\/public_html\/examples\/design\/ClosureDesigns.php","class":"Run","method":"goForARun","call_method":"goForARun","trace":"Run::goForARun","start_line":68,"end_line":70}
        filter 2012-02-12 22:05:15 {"type":"closure","event":"return","object":"static","class":"run_filter","method":"\/home\/prodigyview\/public_html\/examples\/design\/ClosureDesigns.php","trace":"Run::goForARun","start_line":51,"end_line":58}
        observer 2012-02-12 22:05:15 {"type":"closure","object":"static","class":"run_observer","method":"\/home\/prodigyview\/public_html\/public\/examples\/design\/ClosureDesigns.php","trace":"Run::goForARun","start_line":61,"end_line":63}

    When the information is broken down, the data translates to:

    - Called by an adapter, filter or observer
    - The function called was a closure
    - The location of the closure
    - The class::method the adapter was implemented on
    - The trace of where the method was called from
    - Start line and end line

    The code has been proven to work in production environments and features various examples of how to implement it, so the proof of concept is there. It is not DI and accomplishes things that DI cannot. I wouldn't call the code boilerplate, but I would call it bloated. In summary, the weaknesses are bloated code and a learning curve in exchange for aspect-oriented functionality. Beyond the normal fear of something new and different, what are the other weaknesses in this implementation of aspect-oriented design, if any?

    PS: More examples of AOP here: https://github.com/ProdigyView/ProdigyView/tree/master/examples/design

    Read the article

  • SQL SERVER – Cardinality Estimation and Performance – SQL in Sixty Seconds #072

    - by Pinal Dave
    Yesterday I wrote a blog post based on my latest Pluralsight course on learning SQL Server 2014. I discussed the newly introduced cardinality estimation in SQL Server 2014 and how it improves query performance. The cardinality estimation logic is responsible for the quality of query plans and is largely responsible for the performance of any query. This logic had not been updated for quite a while, but in the latest version, SQL Server 2014, it has been re-designed. The new logic now incorporates various assumptions and algorithms for OLTP and warehousing workloads. I hope my earlier blog post clearly explained how the new cardinality estimation logic improves performance. If not, I suggest you watch the following quick video where I explain this concept in extremely simple words. You can download the code used in this course from Simple Demo of New Cardinality Estimation Features of SQL Server 2014.

    Action Item: Here are the blog posts I have previously written. You can read them over here:

    - Simple Demo of New Cardinality Estimation Features of SQL Server 2014
    - Pluralsight Course

    You can subscribe to my YouTube Channel for frequent updates.

    Reference: Pinal Dave (http://blog.sqlauthority.com). Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Video

    Read the article

  • Worldwide Web Camps

    - by ScottGu
    Over the next few weeks Microsoft is sponsoring a number of free Web Camp events around the world.  These provide a great way to learn about ASP.NET 4, ASP.NET MVC 2, and Visual Studio 2010. The Web Camps are two day events.  The camps aren't conferences where you sit quietly for hours and people talk at you – they are intended to be interactive.  The first day is focused on learning through presentations that are heavy on coding demos.  The second day is focused on you building real applications using what you've learned.  The second day includes hands-on labs, and you'll join small development teams with other attendees and work on a project together. We've got some great speakers lined up for the events – including Scott Hanselman, James Senior, Jon Galloway, Rachel Appel, Dan Wahlin, Christian Wenz and more.  I'll also be presenting at one of the camps.

    Below is the schedule of the remaining events (the sold-out Toronto camp was a few days ago):

    Moscow - May 19-19
    Beijing - May 21-22
    Shanghai - May 24-25
    Mountain View - May 27-28
    Sydney - May 28-29
    Singapore - June 04-05
    London - June 04-05
    Munich - June 07-08
    Chicago - June 11-12
    Redmond, WA - June 18-19
    New York - June 25-26

    Many locations are sold out already but we still have some seats left in a few of them.  Registration and attendance to all of the events is completely free.  You can register to attend at www.webcamps.ms. Hope this helps, Scott

    Read the article

  • From Sea to Shining Fusion HCM Specialization

    - by Kristin Rose
    Well, the polls have closed, the votes are in and Oracle Fusion HCM Specialization is finally here! Not only is this Specialization easily achievable, partners are already seeing the "economic" value in it. But don't just take our word for it, watch below as Oracle Diamond Partner, Infosys, shares their experience with Oracle Fusion HCM and all the success they've already seen! Here is how you can make a change and get started today:

    STEP 1: Join OPN
    STEP 2: Join Knowledge Zone
    STEP 3: Check Business and Competency Criteria
    STEP 4: Track Competency Status
    STEP 5: Apply Now

    So let's put our differences aside, put Oracle Fusion first, and come together by learning more about this Oracle Fusion HCM Specialization.  We are OPN and we approve this message, The OPN Communications Team

    Read the article

  • Please help me, I need some solid career advice, put myself in a dumb situation

    - by Kevin
    Hi, first off, I just want to say thank you in advance for looking at my question; I would really value your input on this subject. My core question is how do I proceed from the following predicament. I will be honest with you: I wasted my college experience. I slacked off and didn't take any of my comp sci classes that seriously; somehow I still got out with a 3.25 GPA. But truth be told, I learned nothing. I befriended most of my professors, who went pretty lenient on me in terms of grading. However, I basically came out of college knowing how to program a simple calculator in VB.NET. I was (to my great surprise) hired by a very large, respected company in Denver as a junior developer. Well, the long and the short of it is that I knew so little about programming that I quickly became the office pariah and was almost fired due to my incompetence. It has been 8 months now, and I feel I have learned some basic things and I am not as picked on as I used to be by the other developers. However, everyone hates me and the first few months have given the other developers a horrible perception of me. I am no longer afraid of code or learning, but I have put myself in the precarious position of being the scapegoat of our department. I hate going to work every day because no one there is my friend and pretty much everyone is hostile to me. What should I do? Any advice?

    Read the article

  • New Whitepaper: Primer on Integrating with EBS 12 with Other Applications

    - by Rekha Ayothi
    Oracle E-Business Suite offers several integration points and a variety of integration technologies. While a given integration point may be available through various technologies and products, it is important to select the best approach for your specific integration requirements. I am pleased to announce the publication of a new white paper that can help with this: Oracle E-Business Suite Release 12.1.3 - Integration Products and Technologies Primer (Note 1494997.1)

    This whitepaper reviews integration strategies for Oracle E-Business Suite applications that are available today. The intended audience is solution architects, integration consultants, and anyone else interested in learning about integration options with Oracle E-Business Suite. The white paper outlines the following enterprise application integration styles:

    - Data-centric integration
    - Integration through native interfaces
    - Process-centric integration
    - Event-driven integration
    - B2B integration
    - Integration through web services

    The white paper also discusses Oracle E-Business Suite application layer products and technologies that address the specific needs of each of these integration styles. It concludes with criteria for selecting the appropriate integration-related tools and technologies for your requirements.

    Attending OpenWorld 2012? We have two sessions covering Oracle E-Business Suite integration. Please join us to hear more on this subject:

    - CON9005 - Oracle E-Business Suite Integration Best Practices (Tuesday, Oct 2, 1:15 PM - 2:15 PM, Moscone West 2018)
    - CON8716 - Web Services and SOA Integration Options for Oracle E-Business Suite (Thursday, Oct 4, 11:15 AM - 12:15 PM, Moscone West 2016)

    Related Articles: E-Business Suite Technology Sessions at OpenWorld 2012; Webcast Replay Available: SOA Integration Options for E-Business Suite; BPEL 11.1.1.6 Certified for Prebuilt E-Business Suite 12.1.3 SOA Integrations; New Whitepaper: Defining Web Applications Desktop Integrators That Return Error Messages

    Read the article

  • Xamarin Designer for Android Article

    - by Wallym
    The latest version of Mono for Android includes a long-awaited design surface. Learn how it works. It's interesting to look at the needs of various segments of developers. When I first start looking at an environment, the first thing I need to understand is the UI. I'm not magically born with some knowledge about the environment, and I don't learn well by just reading, so I need some help in getting started. I found this was true when I started Windows-based development in the early 1990s, dynamic web development in the late 1990s, ASP.NET in 2000, Silverlight/WPF, iPhone and Android. I find that getting up to speed with a UI is the single biggest deterrent for someone learning a platform. I find that as a beginner I need the features provided by a design surface. It's only as I grow and become comfortable with a platform that I find that building a UI by hand is more productive. Even as I get more advanced, I can still learn from a designer, so it has value as I grow into a platform. I hope that this article helps you as you dive into Android development.

    Read the article

  • PHP Browser Game Question - Pretty General Language Suitability and Approach Question

    - by JimBadger
    I'm developing a browser game, using PHP, but I'm unsure if the way I'm going about doing it is to be encouraged anymore. It's basically one of those MMOs where you level up various buildings and what have you, but you then commit some abstract fighting entity that the game gives you to an automated battle with another player (producing a textual, but hopefully amusing and varied, combat report). Basically, as soon as two players agree to fight, PHP functions on the "fight.php" page run queries against a huge MySQL database, looking up all sorts of complicated fight moves and outcomes. There are about three hundred thousand combinations of combat stance, attack, move and defensive stance, so obviously this is quite a resource-hungry process, and, on the super-cheapo hosted server I'm using for development, it rapidly runs out of memory. The PHP script for the fight logic currently has about a thousand lines of code in it, and I'd say it's about half-finished as I try to add a bit of AI into the fight script. Is there a better way to do something this massive than simply having some functions in a PHP file calling the MySQL database? I taught myself a modicum of PHP a while ago, and most of the stuff I read online (ages ago) about similar games was all PHP-based. But (a) am I right to be using PHP at all, and (b) am I missing some clever way of doing things that will somehow reduce server resource requirements? I'd consider non-PHP alternatives but, if PHP is suitable, I'd rather stick to that, so there's no overhead of learning something new. I think I'd bite that bullet if it's the best option for a better game, though.

    Read the article

  • eSeminar: Oracle’s Fusion Update for Partners

    - by Richard Lefebvre
    Oracle's Fusion Update for Partners - Thursday, November 17th, 6pm CET

    At OOW, Oracle unveiled Oracle Fusion Applications, the next generation of business applications. By setting the standard for application architecture, design and deployment, customers will be able to extend the value of their applications environment by using Oracle Fusion Applications components side-by-side with their existing applications portfolio. Delivered as a complete suite of modular applications, Oracle Fusion Applications coexist with existing Oracle Applications. As one module, a product family or the entire suite, customers can choose to leverage the advances pioneered by Oracle at a pace that matches business needs for a new level of performance. David Bowin, Director of Oracle's Fusion Applications Team, will host an eSeminar session to address various questions that our partners have regarding Oracle's Fusion Applications. See the schedule below and mark your calendar to attend.

    9:00am - 10:00am Pacific (6pm CET)
    Add the event to your calendar: http://oukc.oracle.com/static11/opn/ics/98300.ics
    Dial-in: 1-877-664-9137 / Passcode 98300
    International: 706-634-9619 - http://www.intercall.com/national/oracleuniversity/gdnam.html
    Access Live Event Learning Link: http://oukc.oracle.com/static09/opn/login/?t=livewebcast|c=1069641479
    Web conference access: http://ouweb.webex.com - Session number: 591807958

    Read the article

  • OpenGL - Stack overflow if I do, Stack underflow if I don't!

    - by Wayne Werner
    Hi, I'm in a multimedia class in college, and we're "learning" OpenGL as part of the class. I'm trying to figure out how the OpenGL camera vs. modelview works, and so I found this example. I'm trying to port the example to Python using the OpenGL bindings - it starts up OpenGL much faster, so for testing purposes it's a lot nicer - but I keep running into a stack overflow error with the glPushMatrix in this code:

        def cube():
            for x in xrange(10):
                glPushMatrix()
                glTranslated(-positionx[x + 1] * 10, 0, -positionz[x + 1] * 10)  # translate the cube
                glutSolidCube(2)  # draw the cube
                glPopMatrix()

    According to this reference, that happens when the matrix stack is full. So I thought, "well, if it's full, let me just pop the matrix off the top of the stack, and there will be room". I modified the code to:

        def cube():
            glPopMatrix()
            for x in xrange(10):
                glPushMatrix()
                glTranslated(-positionx[x + 1] * 10, 0, -positionz[x + 1] * 10)  # translate the cube
                glutSolidCube(2)  # draw the cube
                glPopMatrix()

    And now I get a stack underflow error - which apparently happens when the stack has only one matrix. So am I just waaay off base in my understanding? Or is there some way to increase the matrix stack size? Also, if anyone has some good (online) references (examples, etc.) for understanding how the camera/model matrices work together, I would sincerely appreciate them! Thanks!
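    For what it's worth, the push/pop calls inside that loop are balanced, so cube() on its own cannot grow the matrix stack; an overflow that shows up only after the program has been running usually points to an unmatched glPushMatrix elsewhere (camera or display code) that accumulates once per frame, and the extra glPopMatrix at the top merely hides it until the stack is empty and underflows. A minimal PyOpenGL sketch of a balanced frame is shown below; the camera values and position arrays are placeholders, not taken from the original example.

        from OpenGL.GL import (GL_MODELVIEW, glLoadIdentity, glMatrixMode,
                               glPopMatrix, glPushMatrix, glTranslated)
        from OpenGL.GLU import gluLookAt
        from OpenGL.GLUT import glutSolidCube

        positionx = [0.0] * 11   # placeholder data
        positionz = [0.0] * 11

        def display():
            glMatrixMode(GL_MODELVIEW)
            glLoadIdentity()                        # overwrite the top matrix; nothing carries over between frames
            gluLookAt(0, 5, 30, 0, 0, 0, 0, 1, 0)   # hypothetical camera transform
            for x in range(10):
                glPushMatrix()                      # save the camera matrix
                glTranslated(-positionx[x + 1] * 10, 0, -positionz[x + 1] * 10)
                glutSolidCube(2)
                glPopMatrix()                       # restore it; stack depth is back where it started after every cube

    Registered with glutDisplayFunc(display) inside an ordinary GLUT setup, this keeps the modelview stack at a constant depth from frame to frame, which is the property to check for when hunting the overflow.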

    Read the article

  • Installing SharePoint 2010 and PowerPivot for SharePoint on Windows 7

    - by smisner
    Many people like me want (or need) to do their business intelligence development work on a laptop. As someone who frequently speaks at various events or teaches classes on all subjects related to the Microsoft business intelligence stack, I need a way to run multiple server products on my laptop with reasonable performance. Once upon a time, that requirement meant only that I had to load the current version of SQL Server and the client tools of choice. In today's post, I'll review my latest experience with trying to make the newly released Microsoft BI products work with a Windows 7 operating system.

    The entrance of Microsoft Office SharePoint Server 2007 into the BI stack complicated matters and I started using Virtual Server to establish a "suitable" environment. As part of the team that delivered a lot of education as part of the Yukon pre-launch activities (that would be SQL Server 2005 for the uninitiated), I was working with four - yes, four - virtual servers. That was a pretty brutal workload for a 2GB laptop, which worked if I was very, very careful. It could also be a finicky and unreliable configuration as I learned to my dismay at one TechEd session several years ago when I had to reboot a very carefully cached set of servers just minutes before my session started. Although it worked, it came back to life very, very slowly, much to the displeasure of the audience. They couldn't possibly have been less pleased than me. At that moment, I resolved to get the beefiest environment I could afford and consolidate to a single virtual server. Enter the 4GB 64-bit laptop to preserve my sanity and my livelihood. Likewise, for SQL Server 2008, I managed to keep everything within a single virtual server and I could function reasonably well with this approach.

    Now we have SQL Server 2008 R2 plus Office SharePoint Server 2010. That means a 64-bit operating system. Period. That means no more Virtual Server. That means I must use Hyper-V or another alternative. I've heard alternatives exist, but my few dabbles in this area did not yield positive results. It might have been just me having issues rather than any failure of those technologies to adequately support the requirements.

    My first run at working with the new BI stack configuration was to set up a 64-bit 4GB laptop with a dual-boot to run Windows Server 2008 R2 with Hyper-V. However, I was generally not happy with running Windows Server 2008 R2 on my laptop. For one, I couldn't put it into sleep mode, which is helpful if I want to prepare for a presentation beforehand and then walk to the podium without the need to hold my laptop in its open state along the way (my strategy at the TechEd session long, long ago). Secondly, it was finicky with projectors. I had issues from time to time and while I always eventually got it to work, I didn't appreciate those nerve-wracking moments wondering whether this would be the time that it wouldn't work.

    Somewhere along the way, I learned that it was possible to load SharePoint 2010 on Windows 7, which piqued my interest. I had just acquired a new laptop running Windows 7 64-bit, and thought surely running the BI stack natively on my laptop must be better than running Hyper-V. (I have not tried booting to Hyper-V VHD yet, but that's on my list of things to try, so the jury of one is still out on this approach.) Recently, I had to build up a server with the RTM versions of SQL Server 2008 R2 and SharePoint Server 2010 and decided to follow suit on my Windows 7 Ultimate 64-bit laptop.
    The process is slightly different, but I'm happy to report that it IS possible, although I had some fits and starts along the way. DISCLAIMER: These products are NOT intended to be run in production mode on the Windows 7 operating system. The configuration described in this post is strictly for development or learning purposes and not supported by Microsoft. If you have trouble, you will NOT get help from them. I might be able to help, but I provide no guarantees of my ability or availability to help. I won't provide the step-by-step instructions in this post as there are other resources that provide these details, but I will provide an overview of my approach, point you to the relevant resources, describe some of the problems I encountered, and explain how I addressed those problems to achieve my desired goal.

    Because my goal was not simply to set up SharePoint Server 2010 on my laptop, but specifically PowerPivot for SharePoint, I started out by referring to the installation instructions at the PowerPivot-Info site, but mainly to confirm that I was performing steps in the proper sequence. I didn't perform the steps in Part 1 because those steps are applicable only to a server operating system, which I am not running on my laptop. The instructions in Part 2 won't work exactly as written for the same reason. Instead, I followed the instructions on MSDN, Setting Up the Development Environment for SharePoint 2010 on Windows Vista, Windows 7, and Windows Server 2008. In general, I found the following differences in installation steps from the steps at PowerPivot-Info:

    - You must copy the SharePoint installation media to the local drive so that you can edit the config.xml to allow installation on a Windows client.
    - You also have to manually install the prerequisites. The instructions provide links to each item that you must manually install and provide a command-line instruction to execute that enables required Windows features.

    I will digress for a moment to save you some grief in the sequence of steps to perform. I discovered later that a missing step in the MSDN instructions is to install the November CTP Reporting Services add-in for SharePoint. When I went to test my SharePoint site (I believe I tested after I had a successful PowerPivot installation), I ran into the following error: Could not load file or assembly 'RSSharePointSoapProxy, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. I was rather surprised that Reporting Services was required. Then I found an article by Alan le Marquand, Working Together: SQL Server 2008 R2 Reporting Services Integration in SharePoint 2010, that instructed readers to install the November add-in. My first reaction was, "Really?!?" But I confirmed it in another TechNet article on hardware and software requirements for SharePoint Server 2010. It doesn't refer explicitly to the November CTP, but following the link took me there. (Interestingly, I retested today and there's no longer any reference to the November CTP. Here's the link to download the latest and greatest Reporting Services Add-in for SharePoint Technologies 2010.) You don't need to download the add-in anymore if you're doing a regular server-based installation of SharePoint because it installs as part of the prerequisites automatically.
    When it was time to start the installation of SharePoint, I deviated from the MSDN instructions and from the PowerPivot-Info instructions:

    - On the Choose the installation you want page of the installation wizard, I chose Server Farm.
    - On the Server Type page, I chose Complete.
    - At the end of the installation, I did not run the configuration wizard.

    Returning to the PowerPivot-Info instructions, I tried to follow the instructions in Part 3, which describes installing SQL Server 2008 R2 with the PowerPivot option. These instructions tell you to choose the New Server option on the Setup Role page where you add PowerPivot for SharePoint. However, I ran into problems with this approach and got installation errors at the end. It wasn't until much later, as I was investigating an error, that I encountered Dave Wickert's post that installing PowerPivot for SharePoint on Windows 7 is unsupported. Uh oh. But he did want to hear about it if anyone succeeded, so I decided to take the plunge. Perseverance paid off, and I can happily inform Dave that it does work so far. I haven't tested absolutely everything with PowerPivot for SharePoint but have successfully deployed a workbook and viewed the PowerPivot Management Dashboard. I have not yet tested the data refresh feature, but I have it installed. Continue reading to see how I accomplished my objective.

    I uninstalled SQL Server 2008 R2 and started again. I had different problems which I don't recollect now. However, I uninstalled again and approached installation from a different angle, and my next attempt succeeded. The downside of this approach is that you must do all of the things yourself that are done automatically when you install PowerPivot as a new server. Here are the steps that I followed:

    1. Install SQL Server 2008 R2 to get a database engine instance installed.
    2. Run the SharePoint configuration wizard to set up the SharePoint databases.
    3. In Central Administration, create a Web application using classic mode authentication, as per a TechNet article on PowerPivot Authentication and Authorization.
    4. Then I followed the steps I found at How to: Install PowerPivot for SharePoint on an Existing SharePoint Server. Especially important to note: you must launch setup by using Run as administrator.

    I did not have to manually deploy the PowerPivot solution as the instructions specify, but it's good to know about this step because it tells you where to look in Central Administration to confirm a successful deployment. I did spot some incorrect steps in the instructions (at the time of this writing) in How To: Configure Stored Credentials for PowerPivot Data Refresh. Specifically, in the section entitled Step 1: Create a target application and set the credentials, both steps 10 and 12 are incorrect. They tell you to provide an actual Windows user name and password on the page where you are simply defining the prompts for your application in the Secure Store Service. To add the Windows user name and password that you want to associate with the application - after you have successfully created the target application - you select the target application and then click Set credentials in the ribbon. Lastly, I followed the instructions at How to: Install Office Data Connectivity Components on a PowerPivot server. However, I have yet to test this in my current environment. I did have several stops and starts throughout this process and edited those out to spare you from reading non-essential information.
    I believe the explanation I have provided here accurately reflects the steps I followed to produce a working configuration. If you follow these steps and get a different result, please let me know so that together we can work through the issue and correct these instructions. I'm sure there are many other folks in the Microsoft BI community who will appreciate the ability to set up the BI stack in a Windows 7 environment for development or learning purposes.

    Read the article

  • Scene Graph for Deferred Rendering Engine

    - by Roy T.
    As a learning exercise I've written a deferred rendering engine. Now I'd like to add a scene graph to this engine, but I'm a bit puzzled about how to do this. In a normal (forward rendering) engine I would just add all items (all implementing IDrawable and IUpdateAble) to my scene graph, then travel the scene graph breadth-first and call Draw() everywhere. However, in a deferred rendering engine I have to separate draw calls. First I have to draw the geometry, then the shadow casters and then the lights (all to different render targets), before I combine them all. So in this case I can't just travel over the scene graph and call draw. The way I see it, I either have to travel over the entire scene graph 3 times, checking what kind of object it is that has to be drawn, or I have to create 3 separate scene graphs that are somehow connected to each other. Both of these seem like poor solutions; I'd like to handle scene objects more transparently. One other solution I've thought of is traversing the scene graph as normal and adding items to 3 separate lists, separating geometry, shadow casters and lights, and then iterating these lists to draw the correct stuff. Is this better, and is it wise to repopulate 3 lists every frame?
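    For what it's worth, the last option described above is essentially the render-queue pattern: one traversal buckets nodes into per-pass lists, and each pass then iterates only over the objects it cares about. Rebuilding the lists every frame is generally cheap next to the draw calls themselves, and it avoids keeping three separate graphs in sync. A language-agnostic sketch in Python is below; the node classes and draw calls are hypothetical stand-ins for the engine's own types, not part of the original engine.

        # One walk over the scene graph, three render queues out.
        class SceneNode:
            def __init__(self, children=None):
                self.children = children or []
            def draw(self):
                pass                                   # placeholder for the engine's real draw call

        class Geometry(SceneNode): pass
        class ShadowCaster(SceneNode): pass
        class Light(SceneNode): pass

        def collect(root):
            geometry, shadow_casters, lights = [], [], []
            stack = [root]
            while stack:                               # iterative depth-first traversal
                node = stack.pop()
                if isinstance(node, Geometry):
                    geometry.append(node)
                if isinstance(node, ShadowCaster):
                    shadow_casters.append(node)
                if isinstance(node, Light):
                    lights.append(node)
                stack.extend(node.children)
            return geometry, shadow_casters, lights

        def render_frame(root):
            geometry, shadow_casters, lights = collect(root)   # queues rebuilt each frame
            for node in geometry:
                node.draw()                            # pass 1: fill the G-buffer
            for node in shadow_casters:
                node.draw()                            # pass 2: render the shadow maps
            for node in lights:
                node.draw()                            # pass 3: accumulate lighting
            # final pass: combine the render targets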

    Read the article

  • Computer Science Degree or Computer Engineering Degree?

    - by Paul
    Hello everyone, I'm 23 years old, living in Italy, and this year I will be getting my high school diploma. I'm interested in pursuing a college degree and working in the IT field. At the moment I'm teaching myself Java (I also know Python, HTML, CSS and MySQL). I'm also learning about algorithms and OO design. I'm curious how important a college degree is for me, considering my age, and whether there is a big difference between computer science and computer engineering. There is a computer science university where I currently live but not a computer engineering one. For some reason, universities that offer computer engineering courses are only in bigger cities such as Milan, Bologna and Roma. Cost-wise, it would be cheaper for me to study near home at a computer science school. Career-wise, would a computer engineering university offer me more work opportunities than a computer science degree? Is it easier transitioning from CS to CEN or vice versa? I'm not exactly sure what type of job I want to pursue in the future since I'm still a bit undecided, but definitely not system/network administrator, database administrator, or game developer.

    Read the article

  • Going to Tech*Ed 2010

    - by Nikita Polyakov
    After years of one-night community and volunteering tasks, and even running cool events like ]inbetween[ weekend, I finally get to go to the actual event! And this time it's not in Orlando – it's in New Orleans - which is also exciting! I will be attending many Windows Phone 7 sessions and will hover over the Windows Phone booths. I am also extremely excited about this short exchange I had on twitter with the brand new Windows Phone Partner Community account:

    @wppartner #WindwosPhone 7 Enterprise story is what we are all waiting to hear :) #wp7dev (7:51 PM Jun 1st via TweetDeck, in reply to wppartner)
    @NikitaP We'll definitely be covering that at @WPCDC but we'll also be talking about it at @TechEd_NA next week! (about 4 hours ago via CoTweet, in reply to NikitaP)

    As you might know, I also love Microsoft Expression Blend and SketchFlow. I will be hanging out at the Microsoft Expression TLC [Technical Learning Center] booths in the Expo Hall during these times:

    Day      Start       Finish
    7-Jun    10:30 AM    12:30 PM
    7-Jun    7:30 PM     9:00 PM
    8-Jun    2:45 PM     5:00 PM
    9-Jun    2:45 PM     5:00 PM
    10-Jun   12:15 PM    3:00 PM

    Feel free to find me and chat me up. I'll be tweeting under @NikitaP; if you are in the Florida dev community, use the #teched_fl hash tag. If you are going and you have a Windows Phone 6.5, iPhone/iPad, Android or a Vista/Win7 laptop with you, grab this: Kevin Wolf's TechEd 2010 Schedule and Twitter Tool – One App, 5 Different Platforms, in one word: Aaaaaaamazing!

    Read the article

  • Desktop Fun: Merry Christmas Icon Packs

    - by Asian Angel
    Christmas is getting closer, so it is time to start decorating your desktops! Today we have a collection of fun and colorful Merry Christmas icons to help get you and your desktop ready for the holidays. Note: To customize the icon setup on your Windows 7 & Vista systems see our article here. Using Windows XP? We have you covered here.

    Sneak Preview: Here is the holiday desktop that we put together using the Standard Christmas Icons 2010.1 pack shown below. Note: The original, unmodified version of this wallpaper can be found here. A closer look at the fun icons we used on our desktop…

    The Icon Packs:

    - Charlie Brown Christmas (*.ico format only) Download
    - Frosty the Snowman 1.0 (*.ico format only) Download
    - Winter Icons 1.0 (*.ico format only) Download
    - Christmas Icons Set 1 1.0 (*.ico format only) Download
    - Christmas Icons Set 2 1.0 (*.ico format only) Download
    - Wreaths Icons 1.0 (*.ico format only) Download
    - SketchCons Christmas (*.ico format only) Download
    - Standard Christmas Icons 2010.1 (*.ico, .png, .bmp, and .gif format) Download
    - Christmas Icons (*.ico format only) Download
    - Christmas (*.ico, .png, and .icns format) Download
    - Silent Night (*.png format only) Download
    - My Christmas 1.0 (*.ico and .png format) Download
    - Xmas Festival (*.png format only) Download
    - Xmas Stickers (*.png format only) Download
    - Winter Wonderland (*.ico format only) Download

    Wanting more great icon sets to look through? Be certain to visit our Desktop Fun section for more icon goodness!

    Read the article

  • Ruby but not Rails on my Resume

    - by Ken Bloom
    I have listed Ruby as a skill on my resume because I've been programming in Ruby for 5 years while I work on my Ph.D. thesis. I've mostly been using it to implement natural language processing algorithms. I'm starting to look for a job, and I posted my resume to a few sites (as an extra bonus when applying to certain on-target jobs). Now I get recruiters calling me to offer me Ruby on Rails jobs. The problem is that I've never learned Rails. It was never relevant to what I'm doing for my Ph.D. How do you recommend handling this situation to avoid wasting my time and theirs? (And learning Rails probably isn't an option until I finish my thesis.) Can my resume be adjusted to make this clearer? Should it be adjusted? Should I just politely tell them on the phone that I don't know Rails? By the way, the relevant part of my resume simply says:

    Skills:
    Programming Languages: C, C++, Java, Scala, Ruby, LaTeX
    Databases: MySQL, XML, XPath

    and lists a few other skill areas that couldn't possibly be confused with a Rails developer.

    Read the article

  • Friday Fun: Christmas Tree Light Up

    - by Asian Angel
    Another week has thankfully passed by, so it is time to take a break and have some fun. This week's game tests your ability to light up the whole Christmas tree…can you figure out the correct wiring configuration? Christmas Tree Light Up: The object of the game is simple…light up all of the bulbs on the Christmas tree. While the game may look quick and easy at first, you will need to do some thinking and experimenting to come up with the correct wiring configuration. The instructions are very simple…just click on any of the wiring sections or bulbs to rotate them. Keep in mind that you may have to click a few times to line the wiring sections or bulbs up as desired since the rotation is always clockwise. Note: You will need to use all of the wiring sections available to completely light the tree up. Each time you will be presented with a different starting setup coming from your power source. Time to hook up the lights! Note: It is recommended that you disable the sound for the game since the "rotation" sounds can be slightly irritating. A nice start, but there are still a lot of bulbs to light up. Getting closer… Almost there…only two more bulbs to light up. Success! Have fun playing! Play Christmas Tree Light Up

    Read the article

  • Building Simple Workflows in Oozie

    - by dan.mcclary
    Introduction

    More often than not, data doesn't come packaged exactly as we'd like it for analysis. Transformation, match-merge operations, and a host of data munging tasks are usually needed before we can extract insights from our Big Data sources. Few people find data munging exciting, but it has to be done. Once we've suffered that boredom, we should take steps to automate the process. We want to codify our work into repeatable units and create workflows which we can leverage over and over again without having to write new code. In this article, we'll look at how to use Oozie to create a workflow for the parallel machine learning task I described on Cloudera's site.

    Hive Actions: Prepping for Pig

    In my parallel machine learning article, I use data from the National Climatic Data Center to build weather models on a state-by-state basis. NCDC makes the data freely available as gzipped files of day-over-day observations stretching from the 1930s to today. In reading that post, one might get the impression that the data came in handy, ready-to-model files with convenient delimiters. The truth of it is that I need to perform some parsing and projection on the dataset before it can be modeled. If I get more observations, I'll want to retrain and test those models, which will require more parsing and projection. This is a good opportunity to start building up a workflow with Oozie. I store the data from the NCDC in HDFS and create an external Hive table partitioned by year. This gives me the flexibility of Hive's query language when I want it, but lets me put the dataset in a directory of my choosing in case I want to treat the same data with Pig or MapReduce code.

        CREATE EXTERNAL TABLE IF NOT EXISTS historic_weather(column 1, column2)
        PARTITIONED BY (yr string)
        STORED AS ...
        LOCATION '/user/oracle/weather/historic';

    As new weather data comes in from NCDC, I'll need to add partitions to my table. That's an action I should put in the workflow. Similarly, the weather data requires parsing in order to be useful as a set of columns. Because of their long history, the weather data is broken up into fields of specific byte lengths: x bytes for the station ID, y bytes for the dew point, and so on. The delimiting is consistent from year to year, so writing a SerDe or a parser for transformation is simple. Once that's done, I want to select columns on which to train, classify certain features, and place the training data in an HDFS directory for my Pig script to access.

        ALTER TABLE historic_weather
        ADD IF NOT EXISTS PARTITION (yr='2011')
        LOCATION '/user/oracle/weather/historic/yr=2011';

        INSERT OVERWRITE DIRECTORY '/user/oracle/weather/cleaned_history'
        SELECT w.stn, w.wban, w.weather_year, w.weather_month, w.weather_day,
               w.temp, w.dewp, w.weather
        FROM (
          FROM historic_weather
          SELECT TRANSFORM(...)
          USING '/path/to/hive/filters/ncdc_parser.py'
          AS stn, wban, weather_year, weather_month, weather_day, temp, dewp, weather
        ) w;

    Since I'm going to prepare training directories with at least the same frequency that I add partitions, I should also add that to my workflow. Oozie is going to invoke these Hive actions using what's somewhat obviously referred to as a Hive action. Hive actions amount to Oozie running a script file containing our query language statements, so we can place them in a file called weather_train.hql.

    Starting Our Workflow

    Oozie offers two types of jobs: workflows and coordinator jobs. Workflows are straightforward: they define a set of actions to perform as a sequence or directed acyclic graph.
    Coordinator jobs can take all the same actions as Workflow jobs, but they can be automatically started either periodically or when new data arrives in a specified location. To keep things simple we'll make a workflow job; coordinator jobs simply require another XML file for scheduling. The bare minimum for workflow XML defines a name, a starting point, and an end point:

        <workflow-app name="WeatherMan" xmlns="uri:oozie:workflow:0.1">
          <start to="ParseNCDCData"/>
          <end name="end"/>
        </workflow-app>

    To this we need to add an action, and within that we'll specify the Hive parameters. Also, keep in mind that actions require <ok> and <error> tags to direct the next action on success or failure.

        <action name="ParseNCDCData">
          <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>localhost:8021</job-tracker>
            <name-node>localhost:8020</name-node>
            <configuration>
              <property>
                <name>oozie.hive.defaults</name>
                <value>/user/oracle/weather_ooze/hive-default.xml</value>
              </property>
            </configuration>
            <script>ncdc_parse.hql</script>
          </hive>
          <ok to="WeatherMan"/>
          <error to="end"/>
        </action>

    There are a couple of things to note here:

    - I have to give the FQDN (or IP) and port of my JobTracker and NameNode.
    - I have to include a hive-default.xml file.
    - I have to include a script file.
    - The hive-default.xml and script file must be stored in HDFS.

    That last point is particularly important. Oozie doesn't make assumptions about where a given workflow is being run. You might submit workflows against different clusters, or have different hive-defaults.xml on different clusters (e.g. MySQL or Postgres-backed metastores). A quick way to ensure that all the assets end up in the right place in HDFS is just to make a working directory locally, build your workflow.xml in it, and copy the assets you'll need to it as you add actions to workflow.xml. At this point, our local directory should contain:

    - workflow.xml
    - hive-defaults.xml (make sure this file contains your metastore connection data)
    - ncdc_parse.hql

    Adding Pig to the Ooze

    Adding our Pig script as an action is slightly simpler from an XML standpoint. All we do is add an action to workflow.xml as follows:

        <action name="WeatherMan">
          <pig>
            <job-tracker>localhost:8021</job-tracker>
            <name-node>localhost:8020</name-node>
            <script>weather_train.pig</script>
          </pig>
          <ok to="end"/>
          <error to="end"/>
        </action>

    Once we've done this, we'll copy weather_train.pig to our working directory. However, there's a bit of a "gotcha" here. My pig script registers the Weka Jar and a chunk of jython. If those aren't also in HDFS, our action will fail from the outset -- but where do we put them? The Jython script goes into the working directory at the same level as the pig script, because pig attempts to load Jython files in the directory from which the script executes. However, that's not where our Weka jar goes. While Oozie doesn't assume much, it does make an assumption about the Pig classpath. Anything under working_directory/lib gets automatically added to the Pig classpath and no longer requires a REGISTER statement in the script. Anything that uses a REGISTER statement cannot be in the working_directory/lib directory. Instead, it needs to be in a different HDFS directory and attached to the pig action with an <archive> tag. Yes, that's as confusing as you think it is. You can get the exact rules for adding Jars to the distributed cache from Oozie's Pig Cookbook.

    Making the Workflow Work

    We've got a workflow defined and have collected all the components we'll need to run.
    But we can't run anything yet, because we still have to define some properties about the job and submit it to Oozie. We need to start with the job properties, as this is essentially the "request" we'll submit to the Oozie server. In the same working directory, we'll make a file called job.properties as follows:

        nameNode=hdfs://localhost:8020
        jobTracker=localhost:8021
        queueName=default
        weatherRoot=weather_ooze
        mapreduce.jobtracker.kerberos.principal=foo
        dfs.namenode.kerberos.principal=foo
        oozie.libpath=${nameNode}/user/oozie/share/lib
        oozie.wf.application.path=${nameNode}/user/${user.name}/${weatherRoot}
        outputDir=weather-ooze

    While some of the pieces of the properties file are familiar (e.g., the JobTracker address), others take a bit of explaining. The first is weatherRoot: this is essentially an environment variable for the script (as are jobTracker and queueName). We're simply using them to simplify the directives for the Oozie job. The oozie.libpath piece is extremely important. This is a directory in HDFS which holds Oozie's shared libraries: a collection of Jars necessary for invoking Hive, Pig, and other actions. It's a good idea to make sure this has been installed and copied up to HDFS. The last two lines are straightforward: run the application defined by workflow.xml at the application path listed and write the output to the output directory.

    We're finally ready to submit our job! After all that work we only need to do a few more things:

    1. Validate our workflow.xml
    2. Copy our working directory to HDFS
    3. Submit our job to the Oozie server
    4. Run our workflow

    Let's do them in order. First, validate the workflow:

        oozie validate workflow.xml

    Next, copy the working directory up to HDFS:

        hadoop fs -put working_dir /user/oracle/working_dir

    Now we submit the job to the Oozie server. We need to ensure that we've got the correct URL for the Oozie server, and we need to specify our job.properties file as an argument.

        oozie job -oozie http://url.to.oozie.server:port_number/ -config /path/to/working_dir/job.properties -submit

    We've submitted the job, but why don't we see any activity on the JobTracker? All I got was this funny bit of output: 14-20120525161321-oozie-oracle. This is because submitting a job to Oozie creates an entry for the job and places it in PREP status. What we got back, in essence, is a ticket for our workflow to ride the Oozie train. We're responsible for redeeming our ticket and running the job.

        oozie job -oozie http://url.to.oozie.server:port_number/ -start 14-20120525161321-oozie-oracle

    Of course, if we really want to run the job from the outset, we can change the "-submit" argument above to "-run." This will prep and run the workflow immediately.

    Takeaway

    So, there you have it: the somewhat laborious process of building an Oozie workflow. It's a bit tedious the first time out, but it does present a pair of real benefits to those of us who spend a great deal of time data munging. First, when new data arrives that requires the same processing, we already have the workflow defined and ready to run. Second, as we build up a set of useful action definitions over time, creating new workflows becomes quicker and quicker.
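    As a footnote to the ncdc_parse.hql step near the top: the TRANSFORM script it references is just a streaming program that reads rows on stdin and writes tab-separated columns to stdout, which is why fixed-width parsing of the NCDC files stays simple. A hedged Python sketch of such a parser is below; the byte offsets are illustrative placeholders, not the real NCDC field layout, and the script name simply mirrors the one used in the query.

        # ncdc_parser.py (illustrative). Hive's TRANSFORM streams each input row to
        # this script's stdin and expects one tab-separated output row per line on
        # stdout. The slice offsets are placeholders; check them against the NCDC
        # format documentation before using anything like this for real.
        import sys

        FIELDS = [                      # (name, start, end) byte offsets -- hypothetical
            ("stn", 0, 6),
            ("wban", 7, 12),
            ("weather_year", 14, 18),
            ("weather_month", 18, 20),
            ("weather_day", 20, 22),
            ("temp", 24, 30),
            ("dewp", 35, 41),
            ("weather", 132, 138),
        ]

        for line in sys.stdin:
            record = line.rstrip("\n")
            if not record:
                continue
            columns = [record[start:end].strip() for _, start, end in FIELDS]
            sys.stdout.write("\t".join(columns) + "\n")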

    Read the article
