Search Results

Search found 3788 results on 152 pages for 'life and annuities'.

Page 70 of 152

  • "Give with Bing" - Help raise money for Sports relief while searching for whatever you want

    - by Testas
    While Sport Relief drives fundraising by challenging people to do physical activities such as running a mile, we’re introducing the ‘Bing Search Mile’, which lets people raise money for charity just by searching with Bing. For every 10 searches made, Bing.com will donate 5p to Sport Relief 2010, enabling you, your friends and your family to raise money simply by searching with Bing until the end of March. With the average mile taking about 10 minutes to run, you can make up to 150 searches online in the same time - that’s 75p raised for a good cause per ‘search mile’. And while you’re at it, why not step it up a gear and aim to complete a ‘Search Mile’ each day, or even a ‘Search Marathon’ over the 5-week campaign with your colleagues, friends and family? How to get involved:
    1. Visit GiveWithBing.com and download the Official Sport Relief Bing Counter. Once downloaded, the Sport Relief counter will count all the searches you do on Bing from that point on.
    2. Now that you’re registered (and signed in), invite your friends, family, colleagues or classmates to join in the fundraising with you - GiveWithBing.com automatically generates an email explaining how it works for you to send them - the more people who search with you, the more money you raise. People can also register a school.
    3. Run your ‘search mile’ every day and watch how your searches turn into life-changing cash for charity, with every 10 searches equalling 5p for Sport Relief. You can check your progress by visiting your individual page (more info here).
    This is such a positive initiative and I challenge everyone in the UK to invite their key contacts to be part of Give with Bing. Chris

    Read the article

  • IIM Calcutta – EPBM 14 – Campus Visit – Day 4 – Managing Self, SDCC and Gari

    - by Ram Shankar Yadav
    …I’m becoming more of an addict of writing about my experiences here in Kolkata! Today we started a bit late at around 8 AM, had our breakfast and reached class a little late at around 9:50 AM, and here comes the surprise… Today we had two lectures on “Managing Self” and two lectures on “Sustainable Development and Climate Change” (SDCC). Some of us got a few self-discipline lessons the moment they entered class and were asked to “get out” :D So, frankly speaking, it was a nostalgic moment which reminded us of college ;) We did a FIRO-B test and got very good tips on managing ourselves and on differentiating between a “manager” and a “leader”. After lunch we had our session on SDCC, in which the prof started by explaining the “Credit Crisis” and moved on to Sustainable Development, with a few great examples from industry and life~! After the class we went shopping at Garihat Market, where we ate Pani Puri and Moodi :P ..one more surprise… my room got flooded with lots of my new EPBM friends coming to take copies of the pictures, and we did a lot of chit-chat about anything and everything :D …so far so good… it’s an amazing experience for me and hopefully for others… who came out of their daily chores and went back to the nostalgic lanes of friendship, learning and, most importantly, “Happyness” ~~~ Cheers, ram :) EPBM 14 pics: http://epbm14.shutterfly.com/pictures

    Read the article

  • Does a 77 Year Old Person Like To Use iPhone Siri? Of course!

    - by Gopinath
    When Apple releases any product, it just works, irrespective of the age, capability and ability of its users. It’s in the DNA of Steve Jobs and his colleagues at Apple to build products that just work without any learning curve. The recent iPhone is loaded with Siri, an intelligent personal assistant. But can a 77-year-old person quickly learn to use Siri for his day-to-day activities? Let’s hear from a son who trained his 77-year-old dad to use Siri on iPhone: He caught on much faster than I thought he might. I was feeling proud of him and believed Siri would be a real productivity help in his life — seeing that, at 77, my dad still works full time as a realtor. I was encouraged that he really liked and would use his new personal assistant. Or at least I was until my mom called later that night. "Your father and I were just practicing with his new phone," Sigh. Well, Siri will be great for my dad…if and when he remembers how to find her. Apple products are not just for techies, the way Android mobiles are; they are for everyone. You can read the full story over here. This article titled, Does a 77 Year Old Person Like To Use iPhone Siri? Of course!, was originally published at Tech Dreams. Grab our rss feed or fan us on Facebook to get updates from us.

    Read the article

  • Announcing Fusion Middleware Innovation Awards

    - by Michelle Kimihira
    Author: Neela Chaudhari. Every year at OpenWorld, Oracle announces the winners of its most prestigious awards in Middleware, the Fusion Middleware Innovation Awards. This year, we’ll be announcing the winners and highlighting a few of their original implementations during this key session in the Middleware stream: CON9162, Oracle Fusion Middleware: Meet This Year's Most Impressive Customer Projects, at 11:45 AM on Tuesday, October 2nd in Moscone West, 3001. In addition, we’ll give a sneak peek of a few winners during GEN9394: Fusion Middleware General Session with Hasan Rizvi at 10:15 AM on Tuesday, October 2nd in Moscone West, Hall D! What kinds of customers win the Fusion Middleware Innovation Awards? Winners are selected based on the uniqueness of their business case, business benefits, level of impact relative to the size of the organization, complexity and magnitude of implementation, and the originality of architecture. The winners are selected by a panel of judges who score each entry across multiple scoring categories. This year's categories included:
    - Oracle Exalogic
    - Cloud Application Foundation
    - Service Integration (SOA) and BPM
    - WebCenter
    - Identity Management
    - Data Integration
    - Application Development Framework and Fusion Development
    - Business Analytics (BI, EPM and Exalytics)
    Last year at OpenWorld 2011 we had standing room only in our session, so come early! We had over 30 innovative customers win the award, including companies like BT, Choice Hotels, Electronic Arts, Clorox Company, ING, Dunkin' Brands, Telenor, Haier, AT&T, Manpower, Herbalife and many others. Did you miss your chance this year to nominate your company? Come join us in the awards session to get an edge for next year's submission, and watch this blog for the next opportunity in 2013. There are other awards as part of Oracle's Excellence awards program, or subscribe to our regular Fusion Middleware newsletter to get the latest reminders.

    Read the article

  • BizTalk: mapping with Xslt

    - by Leonid Ganeline
    BizTalk Map Editor (the Mapper) is a good editor, especially in the latest 2010 version of BizTalk. But sometimes it still cannot do a task easily. That is when it is time for Xslt code, and time to remember that maps are executed by the Xslt engine. Right-click the Mapper grid (the area between the source and target schemas) and choose Properties / Custom XSLT Path. Enter the name of the file containing your Xslt code. Only this code will be executed; forget the picture in the Mapper and all those links and functoids. Let’s see a real-life example. There are two source Addresses: one is on the top level, and the second is inside the Member_Address record with MaxOccurs=*. The target address is placed inside the Locator record with MaxOccurs=*. The requirement is to map all source addresses to the one target address structure. (The sample source Xml document and the expected result Xml were shown as screenshots in the original post.) Try to do this mapping with the Mapper and you will spend a good amount of time, and the resulting map will be tricky. If we use Xslt code, the mapping is simple and unambiguous. Simple, elegant.
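    The Xslt itself and the Xml samples were screenshots in the original post and are not part of this excerpt, so what follows is only a rough sketch of the approach: the record names Member_Address and Locator come from the post, while the root and field names (People, Address, Street, City) are made up, and Python's lxml merely stands in for the BizTalk runtime so the stylesheet can be seen executing.

      from lxml import etree

      # Hypothetical source: one top-level Address plus repeating Member_Address records
      # (the field names are invented; only the overall shape follows the post).
      source = etree.XML(b"""
      <People>
        <Address><Street>1 Main St</Street><City>Springfield</City></Address>
        <Member_Address><Address><Street>2 Oak Ave</Street><City>Shelbyville</City></Address></Member_Address>
        <Member_Address><Address><Street>3 Elm Rd</Street><City>Ogdenville</City></Address></Member_Address>
      </People>""")

      # Custom Xslt in the spirit of the post: pick up every source address, wherever
      # it lives, and emit one repeating Locator record per address found.
      stylesheet = etree.XML(b"""
      <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
        <xsl:output method="xml" indent="yes"/>
        <xsl:template match="/People">
          <Locators>
            <xsl:for-each select="Address | Member_Address/Address">
              <Locator>
                <Street><xsl:value-of select="Street"/></Street>
                <City><xsl:value-of select="City"/></City>
              </Locator>
            </xsl:for-each>
          </Locators>
        </xsl:template>
      </xsl:stylesheet>""")

      transform = etree.XSLT(stylesheet)
      print(str(transform(source)))   # three Locator records, one per source address

    In a BizTalk map the same stylesheet would simply be saved to a file and referenced via the Custom XSLT Path property; the Python harness above is only there to make the sketch runnable.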

    Read the article

  • An introductory presentation about testing with MSTest, Visual Studio, and Team Foundation Server 2010

    - by Thomas Weller
    While it was very quiet here on my blog during the last months, the same was not at all true for the rest of my professional life. The simple story is that I was too busy to find the time for authoring blog posts (and you might see from my previous ones that they’re usually not of the ‘Hey, I’m currently reading X’ or ‘I’m currently thinking about Y’ kind…). Anyway. Among the things I did during the last months were setting up a TFS environment (2010) and introducing a development team to the MSTest framework (a.k.a. Visual Studio Unit Testing), some additional tools (e.g. Moq, Moles, White), how all this is supported in Visual Studio, and how it integrates into the broader context of the then-new TFS environment. After wiping out all the material which was directly related to my former customer and reviewing/extending the speaker notes, I thought I’d share this presentation (via Slideshare) with the rest of the world. Hopefully it can be useful to someone else out there… Introduction to testing with MSTest, Visual Studio, and Team Foundation Server 2010. View more presentations from Thomas Weller. Be sure to also check out the slide notes (either by viewing the presentation directly on Slideshare or - even better - by downloading it). They contain quite a bit of additional information, hints, and (in my opinion) best practices.

    Read the article

  • Do I need path finding to make AI avoid obstacles?

    - by yannicuLar
    How do you know when a path-finding algorithm is really needed? There are contexts where you just want to improve AI navigation enough to avoid an object, like a spaceship that won't crash into a planet, or a car that already knows where to steer but needs small corrections to avoid a road bump. As I've seen on similar posts, the obvious solution is to implement some path-finding algorithm, most likely A*, and let your AI-controlled object navigate along the path. Now, I have the necessary skills to implement a path-finding algorithm, and I'm not being lazy here, but I'm still a bit skeptical about whether this is really needed. I have the impression that path-finding is appropriate for navigating through a maze, or for picking a path when there are many alternatives. But for obstacle avoidance, when you do know the path but only need to make slight corrections, is path finding really necessary? Even when the obstacles are sparse or small? I mean, in real life, when you're driving and notice a bump on the road, you just have to pick between steering a bit to the left (and having the bump on your right side) or the other way around. You will not consider stopping or going backwards. Path finding would be appropriate when you need to pick a route through the city, right? So, are there any other methods to help AI navigation besides path-finding? And if there are, how do you know when path-finding is the appropriate algorithm? Thanks for any thoughts.
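    One lightweight alternative often suggested for cases like these is a local steering/avoidance behaviour rather than a planner. The sketch below is purely illustrative; the 2D agent, the obstacle representation and the tuning constants are assumptions, not anything taken from the question.

      import math

      def avoidance_steering(pos, vel, obstacles, look_ahead=50.0, strength=1.0):
          """Return a small lateral steering correction that pushes the agent away
          from the closest obstacle lying ahead of it; no path-finding involved."""
          speed = math.hypot(vel[0], vel[1])
          if speed == 0:
              return (0.0, 0.0)
          hx, hy = vel[0] / speed, vel[1] / speed      # unit heading
          lx, ly = -hy, hx                             # unit "left" perpendicular

          closest = None                               # (ahead, lateral, radius)
          for (ox, oy), radius in obstacles:
              dx, dy = ox - pos[0], oy - pos[1]
              ahead = dx * hx + dy * hy                # distance along our heading
              lateral = dx * lx + dy * ly              # offset from our line of travel
              if 0 < ahead < look_ahead and abs(lateral) < radius:
                  if closest is None or ahead < closest[0]:
                      closest = (ahead, lateral, radius)

          if closest is None:
              return (0.0, 0.0)                        # nothing in the way, keep going

          ahead, lateral, _ = closest
          side = -1.0 if lateral > 0 else 1.0          # steer away from the obstacle centre
          urgency = (look_ahead - ahead) / look_ahead  # nearer obstacle, stronger nudge
          return (side * lx * strength * urgency, side * ly * strength * urgency)

      # A car heading along +x with a small bump slightly to its right:
      print(avoidance_steering(pos=(0, 0), vel=(10, 0), obstacles=[((30, -2), 5)]))

    A correction like this is applied every frame on top of the existing heading, which matches the "slight corrections" scenario; a full path-finder only becomes necessary when obstacles can actually block every direct route.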

    Read the article

  • Did you forget me?

    - by Ratman21
    I know it has been a long time since I last blogged. I'm still at it, looking for work in the “IT” field. Had another phone interview (I only found out during the interview that it was for a one-year contract job, but I would still take it) for a Help Desk job. Didn't get it; they thought I was not an application support person and more of a hardware support person. Gee... I started out in “IT” as a programmer. Then a programmer/computer operator, then a Tandem/LAN operator and finally a Network operator. I had to deal with so many different operating systems, software and applications. And they thought I was too hardware. Well, I am working a temp day job with the U.S. Census. It gets me out of the house and out in the country. I find getting paid to check for living quarters not a bad job, except for the many houses I find that are up for sale, where it looks like it was not the owners' (former owners, it seems) idea, with the kids' toys still in the yards. Not good for someone with an overactive imagination, or for my truck. So far I have backed into a ditch (and had to be pulled out), into a power pole (no damage to the pole and very little to the truck) and into a mail box (no damage to the truck, but the mail box was leaning a little) in the last two weeks. Oh, and I have started reading/using “The Love Dare” book from the movie “Fireproof”. I restarted (yes, I have had to go back to day one from day five) the dare this Sunday. Dare/day one is “Love Is Patient”, and the first dare is (reading from the book): “The first part of this dare is fairly simple. Although love is communicated in a number of ways, our words often reflect the condition of our heart. For the next day, resolve to demonstrate patience and to say nothing negative to your spouse at all. If the temptation arises, choose not to say anything. It's better to hold your tongue than to say something you'll regret.” This was almost too easy, as I can hold back from saying anything bad to anyone, but this can also be a problem in life (you hold back for so long and!!!!!!!!!!!!!! Boom). Check back for dare/day two, “Love Is Kind”.

    Read the article

  • SQLAuthority News – Presenting at Virtual Tech Days TechEd Pre-Con – February 9, 2011

    - by pinaldave
    I will be presenting on the following subject at Virtual Tech Days TechEd Pre-Con on February 9, 2011. Auditing Made Easy: Change Tracking and Change Data Capture. Date and Time: February 9, 2011, 11:45am-12:45pm. Location: Online. In this fast-paced, demo-oriented session we will go over a few concepts related to real-life problems at customer sites. We often see developers and DBAs looking for details such as who dropped a table, who last modified an object, and what exactly was modified. SQL Server 2008 has all the answers. It has various new methods for auditing, where you can not only learn what was changed but also who changed it. In addition, we can capture far more detail by configuring Auditing. We can also prevent changes altogether if proper policy management is configured. If you have attended my session on this subject before, this is going to be an absolutely new session and very much demo oriented. There will be a quiz at the end of the session, and I promise that if you attend the session, you will get all the answers correct. Reference: Pinal Dave (http://blog.SQLAuthority.com)   Filed under: About Me, Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology
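    For orientation only, here is a rough sketch of how the two features named in the abstract are typically switched on. The database name, table name and connection string are assumptions, and pyodbc is used merely as a convenient way to run the T-SQL; none of this is taken from the session itself.

      import pyodbc

      # Hypothetical connection; adjust driver, server, database and authentication.
      conn = pyodbc.connect(
          "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
          "DATABASE=SalesDb;Trusted_Connection=yes;",
          autocommit=True,
      )
      cur = conn.cursor()

      # Change Tracking: lightweight, answers "which rows changed since version N?"
      cur.execute("ALTER DATABASE SalesDb SET CHANGE_TRACKING = ON "
                  "(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);")
      cur.execute("ALTER TABLE dbo.Orders ENABLE CHANGE_TRACKING "
                  "WITH (TRACK_COLUMNS_UPDATED = ON);")

      # Change Data Capture: heavier, records the actual data of each change
      # (needs SQL Server Agent running and an edition that supports CDC).
      cur.execute("EXEC sys.sp_cdc_enable_db;")
      cur.execute("EXEC sys.sp_cdc_enable_table "
                  "@source_schema = N'dbo', @source_name = N'Orders', @role_name = NULL;")

      # Later, ask Change Tracking what has changed since version 0.
      for row in cur.execute("SELECT * FROM CHANGETABLE(CHANGES dbo.Orders, 0) AS ct;"):
          print(row)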

    Read the article

  • Microsoft’s new technical computing initiative

    - by Randy Walker
    I made a mental note from earlier in the year. Microsoft literally buys computers by the truckload. From what I understand, it’s a typical practice amongst large software vendors. You plug a few wires in, you test it, and you instantly have mega tera tera flops (don’t hold me to that number). Microsoft has been trying to plug away at their cloud services (named Azure) which, for the layman, means Microsoft runs your software on their computers, and as demand increases you can allocate more computing power on the fly. With this in mind, it doesn’t surprise me that I was recently sent an executive email concerning Microsoft’s new technical computing initiative. I find it to be a great marketing idea with actual substance behind their real work. From the programmer academic perspective, in college we dreamed about this type of processing power. This has decades of computer science theory behind it. A copy of the email received (note that I almost deleted this email, thinking it was spam due to its length):
    We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can, however, take steps to shorten your commute - using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly.
    Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland. To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges.
    We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic. So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves. The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries.
    Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities. New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems.
    Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980's, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group.
    Enabling more people to make better predictions
    We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers:
    - Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day.
    - Aerospace engineering firm a.i. solutions, Inc. needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights. To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft, from optimal launch times and orbit determination to attitude control and navigation, up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars.
    - Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent - letting the team fine-tune its models for more accurate product pricing and financial projections.
    Our Technical Computing direction
    Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas:
    - Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high-performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed - reliably, consistently and quickly.
    - Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and troubleshoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud.
    - Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology.
    Thinking bigger
    There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible:
    - Better predictions to help improve the understanding of pandemics, contagion and global health trends.
    - Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates.
    - More accurate prediction of natural disasters and their impact to develop more effective emergency response plans.
    With an ambitious charter in hand, this new team is ready to build on our progress to date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better. Bob

    Read the article

  • One eye on my dinner and one eye on SQL server

    - by fatherjack
    LiveJournal Tags: RedGate, Work Life Balance, Tips and Tricks, SQL Server. This is somewhere between a Tweet and a proper blog article - would that be a Bleet? Anyway, I was at a local restaurant yesterday, and after placing my order I was thinking about having to get home and log in to check some SQL Servers, when the thought came to me that, as we were near civilisation, there was likely to be a 3G signal that might actually make using the web browser on my phone bearable. It was surprisingly fast on my HTC Desire; it was almost as good as Wi-Fi. RedGate SQL Monitor works fine in the default HTC browser, and here is the proof: me checking the servers while waiting for the meal to arrive. Everything checked out OK, so I had the evening free from SQL Server. You can get a free 14-day full trial of SQL Monitor from RedGate here or find out more about it at The Future of Monitoring. Disclosure: I am a friend of RedGate and as such regularly make positive comments about their products. I don't get paid for it, but I do get free licenses for testing and reviewing purposes.

    Read the article

  • SkyDrive and Consumer Cloud Services

    - by Tim Murphy
    Paul Thurrott recently posted an article on the future of SkyDrive, and I was asked by @UserCommunity what I thought about its future. So let’s take a look. The breakdown from Microsoft that Paul described is, I believe, an accurate representation of users and usages. While I can’t say that I leverage SkyDrive to the extent that it was meant to be used, I do enjoy having OneNote hosted there and being able to consult and edit it from the desktop, the web and Windows Phone. Taking that one step further, the Midwest Geeks group, which started as the community of Microsoft-related user groups in our region, uses SkyDrive groups and shares calendars and documents. This collaboration aspect isn’t new in itself, but having it connected with the rest of your cloud assets makes life easier. Another recent usage of this type of cloud service is storing your personal music files in order to get that same universal access. This is a scenario that has some arguments for and against. On the one hand, own once and listen anywhere is great, but on the other hand the bandwidth cost becomes a giant downside. This is especially the case since most carriers are now doing away with unlimited data packages. Ultimately I see this type of resource growing and evolving at a phenomenal rate over the next few years as we continue to become more mobile. Having multiple players such as SkyDrive and iCloud will only help to give us more options. Only time will tell where we end up next. del.icio.us Tags: SkyDrive, Cloud Services, Paul Thurrott, UserCommunity

    Read the article

  • Macedonian Code Camp 2011

    - by hajan
    Autumn was filled with a lot of conferences, events, speaking engagements and many interesting happenings in Skopje, Macedonia. First, on October 20, I was speaking at Microsoft Vizija 9 on the topic of ASP.NET MVC3 and Razor. One week ago, on November 15, I spoke for the first time on a topic not related to web development (though deployment of web apps was still part of the demos): “Cloud Computing – Windows Azure” at the Microsoft BizSpark Bootcamp. The next event, which is the biggest event by number of visitors and number of tracks, is the Code Camp 2011 event. After we opened the registrations for the event, we gave away all 600 (free) tickets in the first 15 hours! We were all astonished by the extremely big number of responses we got… For this event, I can freely say that we expect about 700 attendees to come, and we already have 900+ registered. The event will be held on Saturday, 26 November 2011. At Code Camp 2011, I will speak on the topic of ASP.NET MVC Best Practices. There are many interesting things to say in this presentation; I will mainly focus on tips, tricks, guidelines and other practices that I have been using in real-life projects developed using the ASP.NET MVC Framework, with special focus on ASP.NET MVC3 and the next release, ASP.NET MVC4 Developer Preview. There is a big number of well-known local and regional speakers, including 7 MVPs. You can find more info about this event at the official event website: http://codecamp.mkdot.net As for my session, if you have some interesting trick or good practice you have been using in your ASP.NET MVC projects, you can freely share it with me… If I find it interesting, and if it’s not part of the current practices I have included in the presentation (I can’t tell you which ones for now… *secret* ;))… I will consider including it in the presentation. Stay tuned for more info soon… Regards, Hajan

    Read the article

  • TechEd 2012: Day 3 – Build Me A Solution

    - by Tim Murphy
    While digesting my lunch it was time to digest some TFS Build information. While much of my time is spent wearing my developer’s hat, I am still a jack of all trades, and automated builds are an important aspect of any project. Because of this I was looking forward to finding out what new features are available in the latest release of Team Foundation Server. The first feature that caught my attention is the TFS Admin Client. After being used to dealing with NAnt in the past, it is nice to see a build configuration GUI that is so flexible and well thought out. The bonus is that the tools incorporated in Visual Studio 2012 are just as feature rich. Life is good. Since automated builds are the hub of your development process in a continuous integration shop, I was really interested in the process-related options. The biggest value add that I noticed was merge gated check-ins. Merge or batch gated check-ins are an interesting concept: if the build breaks with all the changes, then TFS will run separate builds for each of the check-ins. This ability to identify the actual offending check-in can save a lot of time and gray hair. The safari of TFS Build that was this session was packed with attractions. How do you set up builds, what are the different flavors of builds, how does the system report how the build went? I would suggest anyone who is responsible for build automation spend some serious time with TFS 2012 and VS2012. del.icio.us Tags: Team Foundation Server 2012, TFS, Build, TechEd, TechEd 2012, Visual Studio 2012

    Read the article

  • Spring school for female students and interested women in the fields of mechanical and electrical engineering, 23.2.-27.2.2011

    - by britta.wolf
    Following its successful launch in spring 2010 at the Villingen-Schwenningen campus of Hochschule Furtwangen, the 2nd meccanica feminale, the continuing-education platform for female engineers and students in the fields of mechanical and electrical engineering, will be held from 23 to 27 February 2011 in cooperation with the Universität Stuttgart on the Vaihingen campus. For all interested women, meccanica feminale offers workshops, seminars and lectures at a high scientific level: courses such as "Introduction to MATLAB", "Flow Simulation" and "Materials for Micro- and Nanotechnology" provide interesting opportunities for professional development. In the soft-skills area, courses such as "Work-Life Balance for Study and Career", "Self-Marketing" and "Decision-Making Competence" are offered. For the closing networking evening on 26.02.2011, Dr. Kira Stein, holder of the Federal Cross of Merit, has been secured as speaker with her talk "Modern requirement profiles - women's strengths brought to the point". The programme is available at www.meccanica-feminale.de, where the individual courses can also be booked online. For further information, please contact Dr. Tanja Sieber personally (phone 07720 / 307 4260) or by e-mail (meccanica@hsfurtwangen.de).

    Read the article

  • Building in Change: Project Construction in Asset Intensive Industries

    - by Sylvie MacKenzie, PMP
    According to a recent survey by the Economist Intelligence Unit, sponsored by Oracle, only 51% of project owners rated themselves as effective at delivering their projects to scope, budget, and schedule when confronted with change. In addition, only 43% rated themselves as effective at anticipating potential change. Even with the best processes and technology in place, change is often an unavoidable part of the construction process. How organizations respond to change can mean the difference between delays and cost overruns, and projects being completed on schedule and on budget. Implementing Enterprise Project Portfolio Management, and using a solution to help manage and automate those processes, can help asset-intensive organizations:
    - Govern project and program compliance and regulatory requirements for project success
    - Unite project teams and stakeholders through collaboration and strong feedback methods to speed project completion
    - Reduce the risk of cost and schedule overruns and any resulting penalties to deliver on time and on budget
    - Effectively manage change throughout the project life cycle
    - Ensure sufficient capacity, utilization, and availability of people, skills, and other resources to meet commitments.
    The results of the recent EIU survey sponsored by Oracle, "Building in Change: Project Construction in Asset-Intensive Industries", will be revealed in an upcoming webinar with Hart Energy / Oil & Gas Investor, featuring the Economist Intelligence Unit and Oracle, on April 11th at 1pm CST. Click here for further information or visit http://www.oilandgasinvestor.com/

    Read the article

  • How much Ruby should I learn before moving to Rails?

    - by Kevin
    Just a quick question... I can never get a definitive answer when googling this, either. Some people say you can learn Rails without knowing any Ruby, but at some point you'll run into a brick wall and wish you knew Ruby, and will have to go back and learn it... and some say to learn the "basics" of Ruby before learning Rails, and it will make your life that much easier. My current knowledge is low. I'm not a beginner, but I'm not a pro, either. I went through the Learn Python The Hard Way online book in about a month, but I stopped once I got to the OOP side of Python (I know booleans, if/elif/else statements, for loops, while loops, functions). I agree with learning the "basics" of Ruby before learning Rails, but what exactly are the "basics" of Ruby? Would I need to learn the whole OOP side of Ruby before I went on to Rails? Or would I just need to learn the Ruby syntax up to where I learned Python (booleans, if/elif/else statements, for loops, while loops, functions) before I went on to Rails? Thanks!

    Read the article

  • Can defect containment metrics be readily applied at an organizational level when there is only a consistent organizational process framework?

    - by Thomas Owens
    Defect containment metrics, such as total defect containment effectiveness (TDCE) and phase containment effectiveness (PCE), can be used to give a good indication of the quality of the process. TDCE captures the defects that are found at some point between requirements and the release of a product into the field, indicating the overall effectiveness of the entire process at finding and removing defects. PCE provides more detail about each phase of the software development life cycle and how well the defect detection and removal techniques are working. Applying these metrics makes sense at a level where you have a well-defined process and methodology for product development, often a project. However, some organizations provide a process framework that is tailored at the project level. This process framework would include the necessary guidance for meeting certifications (ISO9001, CMMI), practices for incorporating known good techniques (agile methods, Lean, Six Sigma), and requirements for legal or regulatory reasons. However, the specific details of how to gather requirements, design the system, produce the software, conduct tests, and release are left to the product development teams. Is there any effective way to apply defect containment metrics at an organizational level when only a process framework exists at the organizational level? If not, what might be some ideas for metrics that can be distilled from each project (each using a tailored process that fits into the organizational process framework) and that capture defect containment, so as to discuss the ability of the process to find and remove defects? The end goal of such a metric would be to consolidate the defect containment practices of a large number of ongoing projects and report to management. The target audience would be people in roles such as the chief software engineer and the chief engineer (of all engineering disciplines) for the organization. Although project-specific data would be available, the idea is to produce something that quantifies the general effectiveness of all tailored processes across all ongoing projects. I would suspect that this data would also be presented as part of CMMI, ISO, or similar audits to demonstrate process quality.
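    To make the roll-up idea concrete, here is a small sketch of how per-project phase data might be consolidated into organization-level figures. The phase names and counts are invented, and the TDCE/PCE formulas used are the common textbook ones (in-phase catches divided by all defects originating in that phase, and pre-release catches divided by all defects including those found in the field), which may differ from a given organization's own definitions.

      from collections import defaultdict

      def tdce(pre_release_defects, post_release_defects):
          """Total Defect Containment Effectiveness: share of all known defects
          that were caught before the product reached the field."""
          total = pre_release_defects + post_release_defects
          return pre_release_defects / total if total else None

      def pce(found_in_phase, escaped_from_phase):
          """Phase Containment Effectiveness for one phase: defects originating in
          the phase and caught there, over all defects originating in that phase."""
          originated = found_in_phase + escaped_from_phase
          return found_in_phase / originated if originated else None

      # Hypothetical data reported by each tailored project process: for each phase,
      # (defects caught in-phase, defects that escaped to a later phase or the field).
      projects = {
          "proj_a": {"requirements": (12, 3), "design": (20, 5), "code": (40, 8)},
          "proj_b": {"requirements": (7, 4), "design": (15, 2), "code": (30, 10)},
      }

      # Roll phases up across projects so an organization-level PCE can be reported
      # even though each project tailors its own process.
      rollup = defaultdict(lambda: [0, 0])
      for phases in projects.values():
          for phase, (caught, escaped) in phases.items():
              rollup[phase][0] += caught
              rollup[phase][1] += escaped

      for phase, (caught, escaped) in rollup.items():
          print(f"{phase:>12}: PCE = {pce(caught, escaped):.0%}")

      # Organization-wide TDCE from separate illustrative totals:
      # e.g. 124 defects found before release, 18 reported from the field.
      print("Org TDCE =", f"{tdce(pre_release_defects=124, post_release_defects=18):.0%}")

    The only thing each project must supply for such a roll-up is a pair of counts per phase, which is usually obtainable from any tailored process without prescribing how the phases themselves are run.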

    Read the article

  • Manager keeps changing requirement specification after every demo.

    - by Jungle Hunter
    Background of my working environment: My manager has no background or understanding of computers or software whatsoever. It is highly likely he hasn't seen code in any form (not even from a physical distance of 10 feet or less) in his life. There is no one who understands the complexity of what I am asked to implement, to the point that if I semi-hardcoded something, no one would know. On Joel's test we score an unbelievable 0. The problems: The manager, and at times other "seniors", keep changing the requirement specification. These are changes which, if good engineering is to be done rather than patchy "fixes", require changes in the underlying design. There is absolutely no one who looks at code (probably because no one knows how to, or even that it should be done), which means no one will ever be able to:
    - Appreciate the complexity of the problem or the elegance of the solution.
    - Suggest improvements to the approach.
    - Appreciate the quality of the code.
    - Point out where the code can be improved.
    A lot of jargon is used which makes sense grammatically but fails to make any sense any other way. It doesn't feel, behave or work like a software company. The question: What should be done? Especially regarding there being no one who would point out improvements in my code. Update: To answer HLGEM's (and possibly others') question about what I've done to try and fix it: I offered to set up Redmine and introduce source control to everyone. I said I would recommend distributed (git or mercurial) but would also talk about centralized ones and let the team decide. The response was that things are being done and will be done within weeks. I haven't seen that, nor am I aware if other parts of the company use it.

    Read the article

  • Final Man vs. Machine Round of Jeopardy Unfolds; Watson Dominates

    - by ETC
    The final round of IBM’s Watson against Ken Jennings and Brad Rutter ended last night with Watson coming out with a strong lead over its two human opponents. Read on to catch a video of the match and see just how quick Watson is on the draw. Watson tore through many of the answers, the little probability bar at the bottom of the screen denoting that it was often 95%+ confident in its answers. Some of the more interesting stumbles were, like in the previous matches, based on nuance. By far the biggest “What?” moment of the night, however, was when it answered the Daily Double question of “The New Yorker’s 1959 review of this said in its brevity and clarity, it is ‘unlike most such manuals, a book as well as a tool’”. Watson, inexplicably, answered “Dorothy Parker”. You can’t win them all, eh? Check out the video below to see Watson in action on its final day.

    Read the article

  • links for 2010-03-31

    - by Bob Rhubart
    Andy Mulholland: Rethinking the narrow and deep expertise model - "We increasingly realise that we have to read requirements in a more open way to decide what techniques can be used, what business experience can be added, etc, so the whole idea of encouraging ‘cross’ discipline understanding seems to look increasingly necessary as we look at how technology touches every part of business, and/or any other aspect of life. It is time to rethink the narrow and deep expertise model and consider T-shaped approaches where the depth is complimented by the width to understand how it might be used and how it fits with other capabilities and disciplines too." -- Andy Mulholland (tags: enterprisearchitecture)
    @vambenepe: Smoothing a discrete world - "For the short term (until we sell one) there are three cars in my household. A manual transmission, an automatic and a CVT (continuous variable transmission). This makes me uniquely qualified to write about Cloud Computing." -- William Vambenepe (tags: otn oracle cloud)
    @fteter: The Price of Progress - "I wonder about the price of progress on the business world. Do some of us get attached to old business models or software applications? Do we resist change for the better for emotional reasons? Are we sometimes impediments to progress just because we don't want things to change?" -- Oracle ACE Director Floyd Teter (tags: otn oracle oracleace progress innovation)
    Pat Shepherd: Enterprise Architecture should not be Arbitrary - "If done properly the Business, Application and Information architectures are nailed down BEFORE any technological direction (SOA or otherwise) is set. Those 3 layers and Governance (people and processes), IMHO, are layers that should not vary much as they have everything to do with understanding the business -- from which technological conclusions can later be drawn." - Pat Shepherd, responding to a post by Jordan Braunstein. (tags: oracle otn enterprisearchitecture soa)

    Read the article

  • SQLAuthority News – Story of Seattle – SQLPASS 2011 Event Log

    - by pinaldave
    Just like every year, I attended SQL PASS in Seattle earlier this month. The event was scheduled for Oct 11-14, 2011 in the Seattle convention center. I have been to Seattle more than 6 times so far, so it is not a new city for me anymore. The city has always impressed me with its vibrant life and pleasant weather. Just like every other time, I had an excellent experience in the city once again. Though I arrived only on the day of the event and left right after the event was over – I hardly visited Seattle – I still have some good experiences to share. Here are a few quick photographs from my quick trip (photo captions): Skyline of Seattle; Seattle Convention Center; A shop, Tenzing Momo and Co at Pike St Market; The Seattle Gum Wall; Shoreline in Seattle; Nigel and Paras; First Starbucks (Relocated); People on the streets of Seattle; Food at Sandy’s – All Veg. Well, this is a short summary of my extremely quick city tour of Seattle. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology

    Read the article

  • Oracle Accelerate : Packaged CX Solutions for Growing Companies

    - by Richard Lefebvre
    Oracle Accelerate is Oracle's approach for providing simple-to-deploy, packaged, enterprise-class software solutions to growing midsize organizations through its network of expert partners. They come with a fixed price and a fixed scope, and can be industry- or country-specific. Here is a selection of Oracle Accelerate solutions specially tailored for EMEA-based customers looking to grow their business with CX technology:
    Oracle Sales Cloud
    - Birchman Consulting's Oracle Accelerate Solution for Oracle Sales Cloud
    - CSolutor Oracle Accelerate Solution for Oracle Sales Cloud
    - CapricornVentis Oracle Accelerate Solution for Oracle Sales Cloud
    Oracle Sales Cloud for vertical industries
    - Enigen’s Oracle Accelerate solution for Oracle Fusion CRM for Professional Services
    - BPI's Oracle Accelerate solution for Oracle Sales Cloud for Business Services Companies
    - BPI's Oracle Accelerate Solution for Oracle Sales Cloud for Insurance Companies
    - BPI's Oracle Accelerate solution for Oracle Sales Cloud for Engineering & Construction Companies
    - BPI's Oracle Accelerate Solution for Oracle Sales Cloud for Telecommunications Companies
    - Fellow Consulting's Oracle Accelerate Solution for Oracle Sales Cloud for Consumer Goods industry
    - Fellow Consulting's Oracle Accelerate Solution for Oracle Sales Cloud for Wholesale Distribution
    - Fellow Consulting's Oracle Accelerate Solution for Oracle Sales Cloud for Life Science industry
    Oracle Service Cloud (RightNow)
    - CapricornVentis Oracle Accelerate Solution for Oracle RightNow Cloud Service for Retail Industry for Ireland
    - CapricornVentis Oracle Accelerate Solution for Oracle RightNow Cloud Service for Retail Industry for the United Kingdom
    - Enigen’s Oracle Accelerate Solution for Oracle RightNow Service Cloud for the United Kingdom
    - DNASTREAM’s RapidLaunch Oracle Accelerate solution for RightNow
    Oracle Commerce (ATG)
    - ProgiCommerce - an Oracle Accelerate solution for ATG Commerce delivered by PROGIWEB
    - Spindrift Momentum - an Oracle Accelerate Solution for ATG Commerce for Retail Industry
    - e2x RoadRunner - the ATG Oracle Accelerate solution for Manufacturing Industry
    - e2x RoadRunner - the ATG Oracle Accelerate solution for Telecommunications Industry
    - e2x RoadRunner - the ATG Oracle Accelerate solution for Retail Web Commerce

    Read the article

  • If not gamedev, what do I do?

    - by brainydexter
    Hi, I am a game dev who was working in the games industry and then... got laid off. Ever since then, life couldn't get more stressful! During this time, I have met so many other devs who have also been laid off, irrespective of the number of years they have been in the game. Now, the problem really gets worse: since I am not a US citizen (yes, I am in the US) and am on an international visa here, I might soon have to pack my bags and go back to my native country. Going back is not bad at all, apart from the fact that gamedev is still at a very nascent stage there. There just aren't many opportunities. So employment is the key to maintaining a valid visa status. After giving it a lot of thought, I am thinking of staying away from gamedev jobs for the time being, given the job instability. This brings me to my current problem: I can't think of a domain/place where I can use my game development skills. I know graphics/simulation/visualization is huge, but I can't think straight and am left clueless about where to go from here. What are some of the domains/companies where I can use my skills? I'd appreciate any insight on this (and I apologize if this is not the place to post this kind of question).

    Read the article

  • What have you learnt that has a steep learning curve?

    - by Jonathan Khoo
    Recently, I've invested time in learning the intricacies of Git, and it has got me thinking about time and learning. (My previous experience with version control systems was only limited use of CVS and SVN.) It took me a whole day's worth of reading to be able to understand the concepts and differences of Git. There are an infinite number of things available for us to learn, some more useful than others. I don't know Fortran - I'm relatively young. But looking back at the preceding years of my life, I notice that I'm busier and busier as time goes on. The amount of things I have to get through in a day is increasingly out of my control. It doesn't take a genius to extrapolate that information and realise I'll have even less time in the future - unless I get fired, but I have no strong plans relating to that idea for now. So, given that I have much more time and energy now than I will have in the future: what have you learnt, that has a steep learning curve, that you would recommend to a fellow programmer? Edit: I've stumbled upon the excellent question "What programming skills have provided you the best return on investment?" and have realised that my way of approaching how to spend learning time was naive - it doesn't matter whether ten useful concepts can be learnt in the time of one, as long as they're worth it.

    Read the article
