Search Results

Search found 6972 results on 279 pages for 'sharepoint 2007'.


  • St. Louis ALT.NET

    - by Brian Schroer
    I'm a huge fan of the St. Louis .NET User Group and a regular attendee of their meetings, but I always wished there was a local group that discussed more advanced .NET topics. (That's not a criticism of the group - I appreciate that they want to serve developers with a broad range of skill levels.) That's why I was thrilled when Nicholas Cloud started a St. Louis ALT.NET group in 2010. Here's the "about us" statement from the group's web site:

The ALT.NET community is a loosely coupled, highly cohesive group of like-minded individuals who believe that the best developers do not align themselves with platforms and languages, but with principles and ideas. In 2007, David Laribee created the term "ALT.NET" to explain this "alternative" view of the Microsoft development universe--a view that challenged the "Microsoft-only" approach to software development. He distilled his thoughts into four key developer characteristics which form the basis of the ALT.NET philosophy: You're the type of developer who uses what works while keeping an eye out for a better way. You reach outside the mainstream to adopt the best of any community: Open Source, Agile, Java, Ruby, etc. You're not content with the status quo. Things can always be better expressed, more elegant and simple, more mutable, higher quality, etc. You know tools are great, but they only take you so far. It's the principles and knowledge that really matter. The best tools are those that embed the knowledge and encourage the principles (e.g. Resharper.) The St. Louis ALT.NET meetup group is a place where .NET developers can learn, share, and critique approaches to software development on the .NET stack. We cater to the highest common denominator, not the lowest, and want to help all St. Louis .NET developers achieve a superior level of software craftsmanship.

I don't see a lot of ALT.NET talk in blogs these days. The movement was harmed early on by the negative attitudes of some of its early leaders, including jerk moves like the Entity Framework "vote of no confidence", but I do see occasional mentions of local groups like the St. Louis one. I think ALT.NET has been successful at bringing some of its ideas into the .NET world, including heavily influencing ASP.NET MVC and raising the general level of software craftsmanship for developers working on the Microsoft stack. The ideas and ideals live on, they're just not branded as "this is ALT.NET!"

In the past 18 months, St. Louis ALT.NET meetups have discussed topics like NHibernate, F# and other functional languages, AOP, CoffeeScript, "How Ruby Is Making Me a Stronger C# Developer", using rake for builds, CQRS, .NET dynamic programming, micro web frameworks (Nancy and Jessica), and Git.

ALT.NET doesn't mean (to me, anyway) "alternatives to .NET", but "alternatives for .NET". We look at how things are done in Ruby and other languages/platforms, but always with the idea "What can I learn from this to take back to my 'day job' with .NET?"

Meetings are held at 7 PM on the fourth Wednesday of each month at the offices of Professional Employment Group. PEG is located at 999 Executive Parkway (Suite 100 - lower level) in Creve Coeur (south of Olive off of Mason Road - here's a map). Food is not supplied (sorry if you're a big fan of the Papa John's Crust-Lovers' Pizza that's a staple of user group meetings), but attendees are encouraged to come early and bring/share beer, so that's cool.

Thanks to Nick for organizing, and to Professional Employment Group for lending their offices.
Please visit the meetup site for more information.

    Read the article

  • Pondering New Technology

    - by MOSSLover
    So I have been standing at the end of a fork for a year peering down a corner looking at this way and that way trying to figure out where I fit in.  I was so enthusiastic and excited about Silverlight when it came out.  It was this amazing awesome technology that had this really cool animation and webcam/multimedia piece.  I thought if I put my money on Silverlight it’s going somewhere, then HTML 5 came out and the wind shifted.  I realized times were changing. I have been working with web technologies since I was 15 years old.  Playing with html and javascript and even css back when it first came out.  In tech years 15 years is forever.  Things change so quickly and so often.  So I guess the question is where am I heading?  The answer is mobile technology.  For some reason I was resisting change and I have no idea why.  I guess I really wanted to see more than one player.  I didn’t quite feel that Microsoft was ready with Windows Phone 7.  It was a great start, but it just didn’t feel like they went all the way.  Now with Windows 8 it feels like they are at version 2.0.  It’s like hitting Silverlight 2.0 where they finally added the .Net bits.  The path is paved, but we don’t know where it’s leading.  Then we had 3.0, 4.0, and 5.0 to mature the technology into what it is today (man I’m hoping they are going to roll some of the cool bits into other tech if they don’t exist). Anyway, I’m on board, but I’m not buying a Windows Tablet just yet.  I was hoping for a 7 inch screen from Apple around $300 or just above and a 7 inch screen from the MS side around the same price.  What I got was the Apple side, but nothing from Microsoft.  I was pretty disappointed with the $500 market price on the RT version.  I realized Microsoft is close, but not quite where Apple is today.  Yes the devices have Office that they are offering, but the sticker is just too much for a first generation device.  If you guys remember correctly the first generation iPad was quite expensive.  I guess for a 1st generation device $500 is pretty good. So I guess what I’m trying to say is that I am shifting my focus entirely away from Silverlight and more towards mobile.  I will be doing a lot of postings on iOS, Windows 8, and Windows Phone 8 with SharePoint 2013.  Since I don’t have a tablet and don’t foresee myself buying one just yet it might be mostly on the phones for right now.  I want to do a bunch of testing on various devices on what needs to be done in apps on each device.  I might add a bit on porting code from one to the other.  I think it’s going to be a lot of fun and make things flow a little better for me.  In a way it’s kind of like Star Trek 6 where they talk about the Undiscovered Country.  I’m going to jump forward completely and see where I land. Technorati Tags: SharePoint 2013,Mobile,Windows 8,iOS

    Read the article

  • Templated Razor Delegates – Phil Haack

    - by nmarun
    This post is largely based on Phil Haack's article titled Templated Razor Delegates. I strongly recommend reading that article first. Here's sample code for the same, so you can have a look. I also have a custom type being rendered as a table.

    // my custom type
    public class Device
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public DateTime MfgDate { get; set; }
    }

Now I can write an extension method just for this type.

    public static class RazorExtensions
    {
        public static HelperResult List(this IList<Models.Device> devices, Func<Models.Device, HelperResult> template)
        {
            return new HelperResult(writer =>
            {
                foreach (var device in devices)
                {
                    template(device).WriteTo(writer);
                }
            });
        }
        // ...
    }

Modified my view to make it a strongly typed one and included HTML to render my custom type collection in a table.

    @using TemplatedRazorDelegates
    @model System.Collections.Generic.IList<TemplatedRazorDelegates.Models.Device>

    @{
        ViewBag.Title = "Home Page";
    }

    <h2>@ViewBag.Message</h2>

    @{
        var items = new[] { "one", "two", "three" };
        IList<int> ints = new List<int> { 1, 2, 3 };
    }

    <ul>
        @items.List(@<li>@item</li>)
    </ul>
    <ul>
        @ints.List(@<li>@item</li>)
    </ul>

    <table>
        <tr><th>Id</th><th>Name</th><th>Mfg Date</th></tr>
        @Model.List(@<tr><td>@item.Id</td><td>@item.Name</td><td>@item.MfgDate.ToShortDateString()</td></tr>)
    </table>

We get IntelliSense as well! Just added some items in the action method of the controller:

    public ActionResult Index()
    {
        ViewBag.Message = "Welcome to ASP.NET MVC!";
        IList<Device> devices = new List<Device>
        {
            new Device { Id = 1, Name = "abc", MfgDate = new DateTime(2001, 10, 19) },
            new Device { Id = 2, Name = "def", MfgDate = new DateTime(2011, 1, 1) },
            new Device { Id = 3, Name = "ghi", MfgDate = new DateTime(2003, 3, 15) },
            new Device { Id = 4, Name = "jkl", MfgDate = new DateTime(2007, 6, 6) }
        };
        return View(devices);
    }

Running this, I get the output as expected. Absolutely brilliant! Thanks to both Phil Haack and David Fowler for bringing this out to us. Download the code for this from here. Verdict: RazorViewEngine.Points += 1;
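A note on the view above: it also calls .List on a string array and an IList<int>, which the Device-specific extension shown here does not cover; that is what the "// ..." placeholder hints at. Below is a minimal sketch of what such a generic overload might look like, modeled on the Device version (the class name and exact signature are my assumptions, not copied from Phil Haack's post):

    using System;
    using System.Collections.Generic;
    using System.Web.WebPages; // HelperResult lives here in ASP.NET MVC 3 / Razor

    public static class RazorListExtensions // hypothetical name for this sketch
    {
        // Generic overload: works for any element type, mirroring the shape of
        // the Device-specific List method shown above.
        public static HelperResult List<T>(this IEnumerable<T> items, Func<T, HelperResult> template)
        {
            return new HelperResult(writer =>
            {
                // Run each item through the inline Razor template and write the result out.
                foreach (var item in items)
                {
                    template(item).WriteTo(writer);
                }
            });
        }
    }

With an overload like this in place, the @items.List(...), @ints.List(...) and @Model.List(...) calls in the view can all bind to a single method.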

    Read the article

  • Praise for Europe's Smart Metering & Conservation Efforts

    - by caroline.yu
    Recently, a writer at the Home Energy Team praised the UK for its efforts towards smart metering and energy conservation, with an article entitled "UK Blazing A Trail With Smart Metering At Home?" The article highlighted that the Department of Energy and Climate Change has announced that smart metering will be introduced in the next decade and that all UK households will have smart meters by the year 2020.

In fact, the UK is not the only country striving to achieve carbon reduction targets, as many of its European counterparts have begun to take positive steps towards tackling the issue of energy conservation by implementing innovative new metering and billing technologies as well as promoting alternative energy solutions, such as wind and solar power. Since 1997, the states of the European Union, including France, Germany and Spain, have been working towards achieving a target of 12 percent renewable electricity by 2010. Germany in particular has made a significant achievement so far, having surpassed the target early in 2007. This success is largely due to the German Renewable Energy Act (EEG), which promoted the use of renewable energy.

Recently, analysis from the European Wind Energy Association (EWEA) found that 21 of the EU Member States are meeting or exceeding their national target to achieve 20 percent renewable energy by 2020. However, six states - Belgium, Italy, Luxembourg, Malta, Bulgaria and Denmark - say they will not manage to reach their target through domestic action alone. Bulgaria and Denmark believe that with fresh national initiatives they could meet or exceed their targets, but others, including Italy, may need to import renewable energy from neighboring non-EU countries. Top achievers, according to the EWEA report, are Spain, which believes its renewable energy will reach 22.7 percent by 2020, as well as Germany, Estonia, Greece, Ireland, Poland, Slovakia and Sweden, who will all exceed their targets.

"Importantly, the way that this renewable energy is controlled and distributed must be addressed in order to ensure its success," said Bastian Fischer, vice president and general manager EMEA, Oracle Utilities. "A smart grid infrastructure can enable utilities to deal with load distribution in times of increased need and ensure power is always available from these means. A smart grid also underpins the success of metering and billing technologies, such as smart metering, and allows utilities to deal with increased usage data and provide accurate billing."

Outside of Europe, Australia has made significant steps towards improving water conservation. The Australian Department of Sustainability and Environment took some of the recent advancements made in the energy sector, including new metering and billing solutions, and applied them to the water industry, enhancing customer service and reducing consumption as a result.

The adoption of smart metering in Europe is mainly driven by regulation, but significant technological improvements are being made the world over to change the way we use all kinds of energy. However, the developing markets are lagging behind. One of the primary reasons for this is the lack of infrastructure in place to use as a foundation for setting up energy-saving solutions, which is slowing the adoption of technologies such as smart meters. However, these countries do benefit from having less outdated infrastructure and fewer legacy systems, which are often cited by others as a difficult barrier to deploying new solutions.
As a result, some countries should find new technologies easier to implement and adapt to in the immediate future, without this roadblock.

    Read the article

  • LexisNexis and Oracle Join Forces to Prevent Fraud and Identity Abuse

    - by Tanu Sood
    Author: Mark Karlstrand

About the writer: Mark Karlstrand is a Senior Product Manager at Oracle focused on innovative security for enterprise web and mobile applications. Over the last sixteen years Mark has served as director at a number of tech startups before joining Oracle in 2007. Working with a team of talented architects and engineers, Mark developed Oracle Adaptive Access Manager, a best-of-breed access security solution.

The world's top enterprise software company and the world leader in data-driven solutions have teamed up to provide a new integrated security solution to prevent fraud and misuse of identities. LexisNexis Risk Solutions, a Gold level member of Oracle PartnerNetwork (OPN), today announced it has achieved Oracle Validated Integration of its Instant Authenticate product with Oracle Identity Management.

Oracle provides the most complete Identity and Access Management platform. It is the only identity management provider to offer advanced capabilities including device fingerprinting, location intelligence, real-time risk analysis, and context-aware authentication and authorization, which makes the Oracle offering unique in the industry. LexisNexis Risk Solutions provides the industry-leading Instant Authenticate dynamic knowledge based authentication (KBA) service, which offers customers a secure and cost-effective means to authenticate new users or verify identity for password resets, lockouts and similar scenarios. Oracle and LexisNexis now offer an integrated solution that combines the power of the most advanced identity management platform and superior data-driven user authentication to stop identity fraud in its tracks and, in turn, offer significant operational cost savings. The solution offers the ability to challenge users with dynamic knowledge based authentication based on the risk of an access request or transaction, thereby offering an additional level beyond other authentication methods such as static challenge questions or one-time passwords when needed.

For example, with Oracle Identity Management self-service, the forgotten password reset workflow utilizes advanced capabilities including device fingerprinting, location intelligence, risk analysis and one-time password (OTP) via short message service (SMS) to secure this sensitive flow. Even when a user has lost or misplaced his/her mobile phone and, therefore, cannot receive the SMS, the new integrated solution eliminates the need to contact the help desk. The Oracle Identity Management platform dynamically switches to use the LexisNexis Instant Authenticate service for authentication if the user is not able to authenticate via OTP. The advanced Oracle and LexisNexis integrated solution, thus, both improves user experience and saves money by avoiding unnecessary help desk calls.

Oracle Identity and Access Management secures applications, Juniper SSL VPN and other web resources with a thoroughly modern, layered and context-aware platform. Users don't gain access just because they happen to have a valid username and password. An enterprise utilizing the Oracle solution has the ability to predicate access on the specific context of the current situation. The device, location, temporal data, and any number of other attributes are evaluated in real time to determine the specific risk at that moment. If the risk is elevated, a user can be challenged for additional authentication, refused access or allowed access with limited privileges.
The LexisNexis Instant Authenticate dynamic KBA service plugs into the Oracle platform to provide an additional layer of security by validating a user's identity in high-risk access or transactions. The large and varied pool of data the LexisNexis solution utilizes to quiz a user makes this challenge mechanism even more robust. This strong combination of Oracle and LexisNexis user authentication capabilities greatly mitigates the risk of exposing sensitive applications and services on the Internet, which helps an enterprise grow its business with confidence.

Resources:
Press release: LexisNexis® Achieves Oracle Validated Integration with Oracle Identity Management
Oracle Access Management (HTML)
Oracle Adaptive Access Manager (PDF)

    Read the article

  • Apps UX Unveils New Face of Fusion at OpenWorld 2012

    - by Kathy.Miedema
    By Kathy Miedema, Oracle Applications User Experience

The Oracle Applications User Experience (UX) team is getting ready to unveil the new face of Oracle Fusion Applications at Oracle OpenWorld 2012 in San Francisco next week.

Photos by Martin Taylor, Oracle Applications User Experience
Jeremy Ashley, Vice President of Oracle Applications User Experience, shows the new face of Fusion Applications to a group of trainers at Oracle's headquarters in Redwood Shores, Calif.

Our team spent the past six months working on this project, which embraces simplicity with a modern, productive user experience that aims to help our applications customers rapidly scale deployment of essential self-service tasks and speed adoption by users who need quick access to do quick-entry tasks. We have spent the week before OpenWorld at Oracle headquarters in Redwood Shores, conducting training sessions with Fusion UX Advocates (FXA), Oracle UX Sales Ambassadors (SAMBA), and members of the Oracle Usability Advisory Board (OUAB). We showed the new face of Fusion to customers, partners, ACE Directors, and people from our own sales organization. Next week during OpenWorld, they will be showing demos alongside our team members. To find them, look for the Usable Apps t-shirt, with this artwork:

You can also get a look at the new face of Fusion during OpenWorld at the following sessions and demopods:

GEN9433 - General Session: Oracle Fusion Applications—Overview, Strategy, and Roadmap
Presenter: Chris Leone, Senior Vice President, Oracle
Monday, Oct. 1, 10:45 a.m. – 11:45 a.m. in Moscone West 2002/2004, AND Wednesday, Oct. 3, 10:15 a.m. – 11:15 a.m. in Moscone West 2002/2004

CON9407 - Oracle Fusion Customer Relationship Management: Overview/Strategy/Customer Experiences/Roadmap
Presenter: Anthony Lye, Senior Vice President, Oracle
Monday, Oct. 1, 3:15 – 4:15 p.m. in Moscone West 2008

CON9438 - Oracle Fusion Applications: Transforming Insight into Action
Presenters: Jeremy Ashley, Vice President Applications User Experience, Oracle; Katie Candland, Director Applications User Experience, Oracle; Basheer Khan, founder and CEO of Innowave Technology, an Oracle ACE Director for both Fusion Middleware and Applications, and a Fusion UX Advocate
Tuesday, Oct. 2, 10:15 a.m. - 11:15 a.m. in Moscone West 2007

CON9467 - Oracle's Roadmap to a Simple, Modern User Experience
Presenter: Jeremy Ashley, Vice President Applications User Experience, Oracle
Wednesday, Oct. 3, 3:30 p.m. - 4:30 p.m. in Moscone West 3002/3004

On the demogrounds: Come to the Apps UX pods for a look at enterprise applications on mobile devices such as smart phones and the iPad, and stay for a demo of the new face of Oracle Fusion Applications. Our demopods will also feature some of the cutting-edge tools in Oracle's arsenal of usability evaluation methods. The Exhibition Hall at Oracle OpenWorld 2012 will be open Monday through Wednesday, Oct. 1-3. The demogrounds for Oracle Applications are located on the lower level of Moscone West in San Francisco. Hours for the Exhibition Hall are:
· Monday, 10 a.m. to 6 p.m.
· Tuesday, 9:45 a.m. to 6 p.m.
· Wednesday, 9:45 a.m. to 4 p.m.

    Read the article

  • NetBeans PHP Community Council

    - by Tomas Mysik
    Hi all, today we would like to inform all of you that you now have a chance to improve NetBeans via the NetBeans PHP Community Council. The author of this activity is Timur Poperecinii, and he would like to tell you a few words about it.

Hello passionate technical people,

First of all let me introduce myself: my name is Timur, I'm a developer from Moldova (that little country between Romania and Ukraine). I develop mostly in .NET and jQuery, but I love to learn more; while not an expert, I am familiar with Java (Struts2, Play), PHP (Symfony2), Ruby (Rails), Sencha Touch 2 and other technologies. I was "introduced" to PHP recently by a client of mine who requested that the work be done specifically in PHP.

Let me tell you a little story about my experience with open source and IDEs: when I was studying in university, in 2007 I think, I did a simple little application in PHP and thought, "Damn, if only there was a good IDE for PHP so I could relax and not have to remember all the function names." Then when I searched on the internet, pretty much everyone was using Vim or Emacs on Linux, but they had no autocomplete anyway, just syntax highlighting. I remember using some tool like Notepad++ I think. Nowadays everything has changed: we have highlighting and autocomplete for just about all the standard things in PHP in many IDEs. I use NetBeans for PHP, and I really am happy with the experience I have there with standard PHP code, but for frameworks I still think there is lots of room for improvement. For example, we have some Symfony 2 and Twig support, but I'd love to see more of that coming. For example, I'm a big fan of file templates, where the main goal is to not waste time on writing over and over again something that can be generated, and it counts even more when you don't have a lot of autocomplete. So what I thought was, "Hey, I know Java a little, and NetBeans has plugins, so maybe it's worth trying to do a file templates plugin" - and so I did. You can find details about my Unified Udevi Symfony2 Plugin for NetBeans 7.2 on my blog. It wasn't hard, and it was even fun!

Give back to open source

Now think a little: NetBeans is an open source project and PHP support is just a part of it, so the resources are pretty limited in this area. But we, as a community that uses this product, want to have the best possible experience with PHP and frameworks(!!!). So why don't we GIVE BACK TO OPEN SOURCE? Imagine an IDE that can do all the things you want, and it is free. Now how far is NetBeans from that point? I guess not so far - you might miss a little niche thing that you use on a daily basis, but then the question arises: why don't you make it happen on your own?

NetBeans PHP Community Council

What I proposed is to create a NetBeans PHP Community Council that will be formed of people willing to change something, willing to create plugins for their own needs and for the needs of the community, to test the plugins created by them too, and basically to evolve NetBeans in the direction they want it to go. I have already talked with the NetBeans PHP team. They are only happy to help this Council, with technical advice, by opening some APIs we might need access to, and other things. One important thing to mention is that this Council is a community project, so though we'll have direct discussions with the NetBeans PHP dev team, NetBeans is not the leading force here; the community is. You can see more details about the goals and structure I proposed at the NetBeans PHP Community Council wiki page.
We use this mailing list: [email protected] for discussions and topics related to the Council.

How can I join
To join the NetBeans PHP Community Council, please send an email to [email protected] with the subject of the mail starting with [Council New Member]. You can subscribe to this mailing list here: http://netbeans.org/projects/php/lists. In your mail please indicate your location, age and experience in both Java and PHP. I need this data to assign you to a team. A response will be sent to you with your next assignment and some people to contact. I really hope that you'll make a step forward and try to make your everyday use of NetBeans even more fun.

    Read the article

  • My Tech Ed North America Preview - Certification Edition

    - by Chris Gardner
    In my previous TechEd North America Preview, I addressed all the content I wanted to see at the show. This time, we shall turn our attention to the certifications I might try to pick up. If you have never been to TechEd North America before, one of the greatest things about the event is an on-site certification center. If you have a couple hours to spare, you can walk up to a test.

The first test on my agenda is 70-523 [1]. I took this update test once, but did not do well on the MVC portion [2]. A few practice tests later, and I think I'm ready to fake that section. After that, I need to complete my road to being a master. The good folks here at work have been having a real love / hate relationship with the idea of me becoming an MCM in SQL Server [3]. Of course, before I do that, I need to finally take the SQL administration tests. Thus, we shall add 70-432 [4] and 70-450 [5] to the list.

Speaking of MCM, TechEd North America will have a special on test 88-970 [6]. This test is normally $500, and you have to find a place to take it [7]. However, there is a special 50% off rate for people who take it on location. With those kind of prices, I may just take it as a form of study guide.

As a final push, I may take some Windows Phone exams. I mentioned in my previous post that I may attend the 70-599 [8] Exam Cram session. Unfortunately, I will be staffing the Hands-On-Lab at that time. As we know, this has never stopped me from taking a test. This may lead to fits of 70-506 [9], but after we've come this far...

That should complete my list. Do I really think I'll find time to take 6 tests at TechEd North America? Probably not. I have done it at TechEd North America before, but that was before I was TechEd North America staff. I also had a co-worker pass 9 in one year, but he basically did nothing but travel to Orlando in 2007 to take tests. And what's the point of attending a HUGE conference if you don't network? Of course, networking will have to wait for Friday's post...

[1] Upgrade: Transition Your MCPD .NET Framework 3.5 Web Developer Skills to MCPD .NET Framework 4 Web Developer
[2] Because I never have used, nor do I really think I ever will use, MVC...
[3] By that, I mean they love the idea, and they hate the price
[4] Microsoft SQL Server 2008, Implementation and Maintenance
[5] PRO: Designing, Optimizing and Maintaining a Database Administrative Solution Using Microsoft SQL Server 2008
[6] SQL Server 2008 Microsoft Certified Master: Knowledge Exam
[7] Which isn't nearly as expensive as the Lab Exam, nor as difficult to find a location. However, it is not offered at every testing facility.
[8] PRO: Designing and Developing Windows Phone Applications
[9] TS: Silverlight 4, Development

    Read the article

  • Silverlight Cream for February 06, 2011 -- #1042

    - by Dave Campbell
    In this Issue: Mike Taulty, Timmy Kokke, Laurent Bugnion, Arik Poznanski, Deyan Ginev, Deborah Kurata(-2-), Johnny Tordgeman, Roy Dallal, Jaime Rodriguez, Samuel Jack(-2-), James Ashley.

Above the Fold:
Silverlight: "Customizing Silverlight properties for Visual Designers" Timmy Kokke
WP7: "Back button press when using webbrowser control in WP7" Jaime Rodriguez
Expression Blend: "Blend Bits 21–Importing from Photoshop & Illustrator…" Mike Taulty

From SilverlightCream.com:

Blend Bits 21–Importing from Photoshop & Illustrator…
Mike Taulty is up to 21 episodes on his Blend Bits sequence now, and this one is about using Blend's import capability, such as a .psd file with all the layers intact.

Customizing Silverlight properties for Visual Designers
Timmy Kokke has part 1 of 2 parts on making your Silverlight control properties work in design surfaces such as the Visual Studio designer or Expression Blend.

An error when installing MVVM Light templates for VS10 Express
Laurent Bugnion has released a new version of MVVM Light that resolves a problem with the VS2010 Express version of the templates... no problem with anything else.

Reading RSS items on Windows Phone 7
Arik Poznanski has a post up about reading RSS on a WP7, but better yet, he also has code for a helper class that you can grab, plus an explanation of wiring it up.

Integrating your Windows Phone unit tests with MSBuild #4: The WP7 Unit Test Application
Deyan Ginev has a post up about Telerik's WP7 test app that outputs test results in XML from the emulator so they can be integrated with the MSBuild log.

Accessing Data in a Silverlight Application: EF
I apparently missed this post by Deborah Kurata last week on bringing data into your Silverlight app via Entity Framework... good detailed tutorial in VB and C#.

Updating Data in a Silverlight Application: EF
In Deborah Kurata's latest post, she is continuing with Entity Framework by demonstrating updating to the database... full source code will be produced in a later post.

Fun with Silverlight and SharePoint 2010 Ribbon Control - Part 2 - An In Depth Look At The Ribbon Control
Johnny Tordgeman has Part 2 of his Silverlight and SharePoint 2010 Ribbon up... taking a deep dive into the ribbon... great explanation of the attributes, code included.

Geographic Coordinates Systems
Roy Dallal has some Geo code up that's not necessarily Silverlight, but very cool if you're doing any GIS programming... ya gotta know the coordinate systems!

Back button press when using webbrowser control in WP7
Jaime Rodriguez has a post up discussing the much-lamented back-button action in the certification requirements and how to deal with that in a web browser app.

Multiplayer-enabling my Windows Phone 7 game: Day 1
Samuel Jack challenged himself to build a WP7 game in 3 days... now he's challenging himself to make it multiplayer in 3 days... this first hour-to-hour post is research of networking and an Azure server-side solution.

Multiplayer-enabling my Windows Phone 7 game: Day 2–Building a UI with XPF
Day 2 for Samuel Jack getting the multiplayer portion of his game working in 3 days... this day involves getting up to speed with XPF.

How to Hotwire your WP7 Phone Battery
Did you realize if you run your WP7 battery completely down that you can't charge it? James Ashley reports that circumstance, and how he resolved it.

Stay in the 'Light!
Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Dissing Architects, or "What's wrong with the coffee?"

    - by Bob Rhubart
    In my conversations with people in architect roles, tales of animosity, disrespect, and outright hostility aren't uncommon. And it's clear that in more than a few organizations architects regularly face a tough uphill climb. For architects with the requisite combination of technical, organizational, and people skills, that rough treatment is grossly undeserved. But tales of unqualified people in positions up and down the IT food chain are also easy to come by. So what's the other side of the architect story? Are some architects tarnishing the role and making life miserable for their more qualified colleagues? The various quotes included below were culled from a variety of sources. The criticism is harsh, and the people behind these quotes clearly have issues with architects. Still, whether based on mere opinion or actual experience, the comments shed some light on behaviors that should raise red flags for anyone pursuing a career as an architect. If you're an architect, and you've ever noticed that your coffee tastes like window cleaner, or your car is repeatedly keyed, or no one ever holds the elevator for you, maybe you need to do a little soul searching... Those Who Can, Code; Those Who Can't, Architect | Joe Winchester [May 18, 2007] "At the moment there seems to be an extremely unhealthy obsession in software with the concept of architecture. A colleague of mine, a recent graduate, told me he wished to become a software architect. He was drawn to the glamour of being able to come up with grandiose ideas - sweeping generalized designs, creating presentations to audiences of acronym addicts, writing esoteric academic papers, speaking at conferences attended by headless engineers on company expense accounts hungrily seeking out this year's grail, and creating e-mails with huge cc lists from people whose signature footer is more interesting than the content. I tried to re-orient him into actually doing some coding, to join a team that has a good product and keen users both of whom are pushing requirements forward, to no avail. Somehow the lure of being an architecture astronaut was too strong and I lost him to the dark side." Don't Let Architecture Astronauts Scare You | Joel Spolsky [April 21, 2001] "It's very hard to get them to write code or design programs, because they won't stop thinking about Architecture. They're astronauts because they are above the oxygen level, I don't know how they're breathing. They tend to work for really big companies that can afford to have lots of unproductive people with really advanced degrees that don't contribute to the bottom line. Remember that the architecture people are solving problems that they think they can solve, not problems which are useful to solve." Non Coding Architects Suck | Richard Henderson [May 24, 2010] "If a guy with a badge saying 'system architect' looks blank on low-level issues then he is not an architect, he is a business-analyst who went on a course. He will probably wax lyrical on all things high-level and 'important.' He will produce lovely object hierarchies without a clue to implementation. He will have a moustache and play golf." Architects Play Golf | Sunir Shah [August 15, 2012] "Often arrogant architects are difficult to get a hold of during the implementation phase because they no longer feel the need to stick around. Especially around midnight when most of the poor sob [sic] developers are still banging away. After all, they've already solved the problem--the rest is just an implementation exercise." 
Engineer vs Architect (Part of a discussion on the IT Architect Network Group on LinkedIn)
"[An] architect spends his time producing white papers full of acronyms he does not understand but that impress his boss [while the] engineer keeps his head down and does the actual job."

Architects Don't Code | [Author Unknown]
"Faulty belief: System Architects don't need to code anymore. They know what they are talking about by virtue of the fact that they are System Architects."

    Read the article

  • 65536% Autogrowth!

    - by Tara Kizer
    Twice a year, we move our production systems to our disaster recovery site. Last Saturday night was one of those days. There are about 50 SQL Server databases to be moved to the DR site, which is done via database mirroring. It takes only a few seconds to failover, but some databases have a bit more involved work such as setting up replication. Everything went relatively smoothly, but we encountered a weird bug on our most mission critical system.

After everything was successfully failed over to the DR site, it was noticed that mirroring was in a suspended state on one of the databases. We thought we had run into a SQL Server 2005 bug that we had been encountering and were working with Microsoft on a fix. Microsoft did fix it in both SQL Server 2005 service pack 3 cumulative update package 13 and service pack 4 cumulative update package 2; however, SP3 CU13 and SP4 both recently failed on this system, so we were not patched yet with the bug fix. As the suspended state was causing us issues with replication, we dropped mirroring. We then noticed we had 10MB of free disk space on the mount point where the principal's data files are stored. I knew something went amiss as this system should have at least 150GB free on that mount point. I immediately checked the main database's data file and was shocked to see an autogrowth size of 65536%. The data file autogrew right before mirroring went into the suspended state. 65536%!

I didn't have a lot of time to research whether this autogrowth problem was a known SQL Server bug, so I deferred that research to today. A quick Google search yielded no results, but emphasis on "quick". I checked our performance system, which was recently restored with a copy of the affected production database, and found the autogrowth setting to be 512MB. So this autogrowth bug was encountered sometime in the last two weeks. On February 26th, we had attempted to install SQL 2005 SP4 on production, however it had failed (PSS case open with Microsoft). I suspected that the SP4 failure was somehow related to this autogrowth bug, although that turned out not to be the case.

I then tweeted (@TaraKizer) about this problem to see if the SQL Server community (#sqlhelp) had any insights. It seems several people have either heard of this bug or encountered it. Aaron Bertrand (blog|twitter) referred me to this Connect item.

Our affected database originated on SQL Server 2000 and was upgraded to SQL Server 2005 in 2007. Back on SQL Server 2000, we were using the default file growth setting, which was a percentage. Sometime after the 2005 upgrade is when we changed it to 512MB. Our situation seemed to fit the bug Aaron referred to me, so now the question was whether Microsoft had fixed it yet.

I received a reply to my tweet from Amit Banerjee (twitter) that it had been fixed in SP3 CU1 (KB958004). My affected system is SP3 CU8, so I was initially confused why we had encountered the bug. Because I don't read things fully, I had missed that there are additional steps you have to follow after applying the bug fix. Amit set me straight. Although you can read this information in the KB article, I will also copy it here in case you are as lazy as me and miss the most important section of it (although if you are as lazy as me, you won't have read this far down my blog post): This hotfix will prevent only future occurrences of this problem.
For example, if you restore a database from SQL Server 2000 to a SQL Server 2005 instance that contains this hotfix, this problem will not occur. However, if you already have a database that is affected by this problem, you must follow these steps to resolve this problem manually:
1. Apply this hotfix.
2. Set the file growth settings for the affected files to percentage settings, and then set the settings back to megabyte settings.
3. Take the database offline, and then bring it back online.
4. Verify that the values of the is_percent_growth column are correct in the sys.database_files system table and in the sys.master_files system table.
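For the verification step above, the is_percent_growth and growth values can be read straight from sys.database_files. Here is a minimal C# sketch of that check (the connection string and the choice of ADO.NET are my own assumptions; the same query is normally just run as T-SQL in Management Studio):

    using System;
    using System.Data.SqlClient;

    class AutogrowthCheck
    {
        static void Main()
        {
            // Assumed connection string - point it at the affected database.
            const string connectionString =
                "Server=.;Database=AffectedDb;Integrated Security=true";

            // growth is interpreted as a percentage when is_percent_growth = 1,
            // otherwise as a number of 8 KB pages.
            const string sql =
                "SELECT name, is_percent_growth, growth FROM sys.database_files";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        string fileName = reader.GetString(0);
                        bool isPercent = reader.GetBoolean(1);
                        int growth = reader.GetInt32(2);

                        Console.WriteLine("{0}: growth = {1} ({2})",
                            fileName,
                            growth,
                            isPercent ? "percent" : "pages of 8 KB");
                    }
                }
            }
        }
    }

If is_percent_growth comes back as 1 for a file you believe is configured to grow in megabytes, that matches the symptom the KB article describes.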

    Read the article

  • Webcast Q&A: Demystifying External Authorization

    - by B Shashikumar
    Thanks to everyone who joined us on our webcast with the SANS Institute on "Demystifying External Authorization". Also a special thanks to Tanya Baccam from SANS for sharing her experiences reviewing Oracle Entitlements Server. If you missed the webcast, you can catch a replay of the webcast here. Here is a compilation of the slides that were used on today's webcast.

SANS Institute Product Review: Oracle Entitlements Server

We have captured the Q&A from the webcast for those who couldn't attend.

Q: Is Oracle ADF integrated with Oracle Entitlements Server (OES)?
A: In Oracle Fusion Middleware 11g and later, Oracle ADF, Oracle WebCenter, Oracle SOA Suite and other middleware products are all built on Oracle Platform Security Services (OPSS). OPSS provides many security functions like authentication, audit, credential stores, token validation, etc. OES is the authorization solution underlying OPSS. And OES 11g unifies different authorization mechanisms including Java2/ABAC/RBAC.

Q: Which portal frameworks support the use of OES policies for portal entitlement decisions?
A: Many portals including Oracle WebCenter 11g run natively on top of OES. The authorization engine in WebCenter is OES. Besides, OES offers out-of-the-box integration with Microsoft SharePoint, so SharePoint sites, sub-sites, web parts, navigation items and document access control can all be secured with OES. Several other portals have also been secured with OES, e.g. IBM WebSphere Portal.

Q: How do we enforce Separation of Duties (SoD) rules using OES (and how does that integrate with a product like OIA)?
A: A product like OIM or OIA can be used to set up and govern SoD policies. OES enforces these policies at run time. Role mapping policies in OES can assign roles dynamically to users under certain conditions, so this makes it simple to enforce SoD policies inside an application at runtime.

Q: Our web application has objects like buttons, text fields, drop-down lists etc. Is there any "autodiscovery" capability that allows me to use/see those web page objects so I can start building policies over those objects? Or how does it work?
A: There are a few different options with OES. When you build an app and make authorization calls with the app in the test environment, you can put OES in discovery mode and have OES register those authorization calls and decisions. Instead of doing this after the fact, an application like Oracle iFlex has built-in UI controls where, when the app is running, a script can intercept authorization calls and migrate those over to OES. And in Oracle ADF, a lot of resources are protected, so pages, task flows and other resources can be registered without OES knowing about them.

Q: Does the current Oracle Fusion Applications release use OES? The documentation does not seem to indicate it.
A: The current version of Fusion Apps is using a preview version of OES. Soon it will be replaced with OES 11g.

Q: Can OES secure mobile apps?
A: Absolutely. Nowadays users are bringing their own devices such as a smartphone or tablet to work. With the Oracle IDM platform, we can tie identity context into the access management stack. With OES we can make use of context to enforce authorization for users accessing apps from mobile devices. For example, we can take into account different elements like authentication scheme, location, device type, etc. and tie all that information into an authorization decision.

Q: Does Oracle Entitlements Server (OES) have an ESAPI implementation?
A: OES is an authorization solution.
ESAPI/OWASP is something we include in our platform security solution for all Oracle products, not specifically in OES.

Q: ESAPI has an authorization API. Can I use that API to access OES?
A: If the API supports an interface/SSPI model that can be configured to invoke an external authorization system through some mechanism, then yes.
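To make the earlier point about tying context into an authorization decision a bit more concrete, here is a small, purely illustrative C# sketch of risk-based access evaluation. The class names, signals and thresholds are invented for this example; this is not the Oracle Entitlements Server API.

    using System;

    // Illustrative only - not the OES object model.
    enum AccessDecision { Allow, AllowLimited, ChallengeKba, Deny }

    class AccessRequest
    {
        public bool StrongAuthenticationUsed { get; set; } // e.g. OTP rather than password only
        public bool KnownDevice { get; set; }               // device fingerprint seen before
        public bool UnusualLocation { get; set; }           // e.g. new country for this user
        public bool SensitiveResource { get; set; }         // e.g. payment or admin function
    }

    static class RiskEvaluator
    {
        // Combine context signals into a score, then map the score to a decision.
        public static AccessDecision Evaluate(AccessRequest request)
        {
            int risk = 0;
            if (!request.StrongAuthenticationUsed) risk += 20;
            if (!request.KnownDevice) risk += 30;
            if (request.UnusualLocation) risk += 30;
            if (request.SensitiveResource) risk += 20;

            if (risk >= 80) return AccessDecision.Deny;
            if (risk >= 50) return AccessDecision.ChallengeKba;   // step-up, e.g. dynamic KBA
            if (risk >= 30) return AccessDecision.AllowLimited;
            return AccessDecision.Allow;
        }
    }

    class Demo
    {
        static void Main()
        {
            var request = new AccessRequest
            {
                StrongAuthenticationUsed = false,
                KnownDevice = false,
                UnusualLocation = true,
                SensitiveResource = true
            };
            Console.WriteLine(RiskEvaluator.Evaluate(request)); // prints Deny
        }
    }

In a real deployment this kind of policy evaluation is externalized to the authorization service rather than hard-coded in the application, which is exactly the point of a product like OES.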

    Read the article

  • My search what the Cloud will mean for my Work, part 2

    - by Kay Sellenrode
    My experience with the cloud and why work will change and not disappear.

Until now I have had multiple experiences with the cloud, mostly good. I have worked on multiple cloud solutions in the past, but let me describe them as 0.x versions. For me the first real serious cloud experience was a bit more than a year ago, when our company switched from an in-house server to Microsoft BPOS as a complete replacement.

Since we are a small consultancy firm and don't have that much else to do than consulting, our IT requirements are quite simple. We need mail and storage space for our documents. With the in-house server we had multiple outages during a year, mostly due to a lack of administration. Being consultants in the field and hardly having time to maintain a server, BPOS was and still is the right solution for us. Since the migration we have had fewer outages and a much more robust solution. Have we run into issues with BPOS for our own environment? No, not that I'm aware of.

Based on this experience I took a stance on the deployability of BPOS and cloud solutions: they are suitable for the MKB (Dutch for small and medium-sized businesses). Most small businesses don't have enough work to hire a full-time IT admin. Hiring a service provider to maintain their own server might be even more costly than hiring an admin. So seeing the capabilities of BPOS and the needs of most businesses, I see it as a great solution that gives the business a complete server replacement for a fixed price per user, resulting in a clear budget for IT spending, something most small businesses have been looking for for a long time.

So right now I'm deploying BPOS with a customer, and I run into some of the Cloud 1.0 issues. In my opinion BPOS is a good working Cloud version 1.0 solution. What do I mean by 1.0? Well, 1.0 is mostly a tested solution (unlike 0.x versions) but it still has quite a few limitations caused by too little market experience. In my opinion this is also the reason why we don't see that many BPOS customers yet, and why I think Office 365 will make a huge difference. What I have seen of 365 shows me it is a Cloud 2.0 version, meaning it has all the needed features and is much more flexible for the customer. This is also why I see changes happening in my field of work: changes, not unemployment, due to cloud solutions.

Cloud 1.0 solutions gave me the idea that if every customer adopted them I would be out of work. But in reality Cloud 1.0 solutions are here just to set the market needs. The Cloud 2.0 and higher versions will give the customer much more flexibility, but also require the need for a consultant. Where the 1.0 versions are simple to set up and maintain, the 2.0 solutions need more thought upfront and afterwards. For example, BPOS in its 1.0 version brings you a very simplified Exchange 2007 solution, suitable for some customers. Looking at Office 365, you receive almost a full-blown Exchange 2010 solution. I expect this to be even more customizable in the next version.

In my search for the changes to my work I try to regularly write a post with my thoughts around the cloud and the impact on my work as a consultant. I'm also planning to present around this topic, so if anyone is interested in seeing me present on it, you're more than welcome to contact me.

    Read the article

  • Visual Studio 2012 and .NET 4.5 now Live!

    - by Tarun Arora
    Today was the formal launch event for Visual Studio 2012 and .NET 4.5, a state-of-the-art development solution for building modern applications that span connected devices and continuous services, from the client to the cloud. The event was streamed live from http://visualstudiolaunch.com. S. Somasegar, corporate vice president of the Developer Division, opened the keynote, Jason Zander dived deeper into how to leverage Visual Studio 2012 and .NET 4.5 to build modern applications, and Brian Harry covered all the awesome features in Visual Studio 2012 that improve application lifecycle management.

I. Summary of the announcements made today

1. Visual Studio Updates coming this fall – VS Update will better support agile teams, enable continuous quality, elevate SharePoint development with application lifecycle management (ALM) tools, and expand Visual Studio 2012 Windows development capabilities. It will be available as a community technology preview (CTP) later this month and in final release later this calendar year. A comprehensive list of what will be on offer can be found here.

2. Visual Studio Express 2012 for Windows Desktop – Visual Studio Express 2012 for Windows Desktop brings the newest desktop development capabilities in Visual Studio 2012 to Express users, too. You would be excited to know that the Express SKU will support integration with TFS, among other cool features I would like to mention such as unit testing, code analysis, and dependency management with NuGet; a full list and download links can be found here.

3. F# tools for Visual Studio Express 2012 for Web – This F# tools release adds F# 3.0 components, such as the F# 3.0 compiler, F# Interactive, IDE support, and new F# features such as type providers and query expressions, to your Visual Studio 2012 Express for Web. More details and download links can be found here.

4. Visual Studio TFS 2012 Power Tools – The TFS 2012 Power Tools bring the goodness of Best Practice Analyzer, Process Template Editor, Storyboard Shapes, Team Explorer enhancements, the TFPT command line, TFS server backups, etc. to your TFS 2012 installation. They can be downloaded right away from here.

II. Road shows

There will be many more community road shows this month, packaged with hours of demos and discussions. The Visual Studio UK Team has just announced that there will be four UK launch events, face-to-face sessions including a product group speaker and partner sessions:
Edinburgh, 1st October
Manchester, 3rd October
London, 4th October
Reading, 5th October

III. Get Started

Download Visual Studio 2012 and the additional supporting software from here. The Visual Studio development team has put together over 60 videos to help you learn about the new Visual Studio 2012 capabilities in more detail, and all of these will be available for watching here.

IV. What's Next

A lot more exciting stuff is lined up:
Windows 8 – Anticipated release: Oct. 26 (UPDATED 9/12)
Windows Server 2012 – Released (UPDATED 9/4)
System Center 2012 – Released (UPDATED 9/11)
SQL Server 2012 – Released (UPDATED 4/2)
Internet Explorer 10 – Anticipated release: Between Q3 2012 and early 2013 (UPDATED 5/3)
Office 2013 – Anticipated release: Q4 2012 or Q1 2013 (UPDATED 9/12)
Exchange 2013 – Anticipated release: Q4 2012 (UPDATED 7/26)
Visual Studio 2012 – Released (UPDATED 9/12)
Kinect for Windows – Released (UPDATED 9/4)
Windows Phone "Tango" and 8 – "Tango": Released; anticipated "Windows Phone 8" release: Q4 2012 (UPDATED 9/5)
Dynamics ERP Online – Anticipated release: September or October 2012 (UPDATED 7/20)
Office 365 – Anticipated update schedule: "Almost weekly" (UPDATED 9/12)
Windows Azure – Rumored CTP release: Spring 2012 (UPDATED 9/7)
SharePoint 2013 – Anticipated release: Q4 2012 (UPDATED 8/21)

Enjoy

    Read the article

  • Opening a new Windows from ASP.NET code behind

    - by TATWORTH
    At http://weblogs.asp.net/infinitiesloop/archive/2007/09/25/response-redirect-into-a-new-window-with-extension-methods.aspx there is an excellent post on how to open a new window from code behind. The purists may not like it, but it helped solve a problem for a client's client. Here is an update for VS2010 users:

using System;
using System.Web;
using System.Web.UI;

/// <summary>
/// Response helper for opening a popup window from code behind.
/// </summary>
public static class ResponseHelper
{
    /// <summary>
    /// Redirect to a popup window.
    /// </summary>
    /// <param name="response">The response.</param>
    /// <param name="url">URL to open to.</param>
    /// <param name="target">Target of window: _self or _blank.</param>
    /// <param name="windowFeatures">Features such as window bar.</param>
    /// <remarks>
    ///     <list type="bullet">
    ///         <item>
    /// From http://weblogs.asp.net/infinitiesloop/archive/2007/09/25/response-redirect-into-a-new-window-with-extension-methods.aspx
    /// </item>
    /// <item>
    /// Note: If you use it outside the context of a Page request, you can't redirect to a new window. The reason is the need to call the ResolveClientUrl method on Page, which I can't do if there is no Page. I could have just built my own version of that method, but it's more involved than you might think to do it right. So if you need to use this from an HttpHandler other than a Page, you are on your own.
    /// </item>
    ///         <item>
    /// Beware of popup blockers.
    /// </item>
    /// <item>
    /// Note: Obviously when you are redirecting to a new window, the current window will still be hanging around. Normally redirects abort the current request -- no further processing occurs. But for these redirects, processing continues, since we still have to serve the response for the current window (which also happens to contain the script to open the new window, so it is important that it completes).
    /// </item>
    /// <item>
    /// Sample call: Response.Redirect("popup.aspx", "_blank", "menubar=0,width=100,height=100");
    /// </item>
    ///     </list>
    /// </remarks>
    public static void Redirect(this HttpResponse response, string url, string target, string windowFeatures)
    {
        if ((String.IsNullOrEmpty(target) || target.Equals("_self", StringComparison.OrdinalIgnoreCase)) && String.IsNullOrEmpty(windowFeatures))
        {
            response.Redirect(url);
        }
        else
        {
            Page page = (Page)HttpContext.Current.Handler;
            if (page == null)
            {
                throw new InvalidOperationException("Cannot redirect to new window outside Page context.");
            }

            url = page.ResolveClientUrl(url);

            string script;
            if (!String.IsNullOrEmpty(windowFeatures))
            {
                script = @"window.open(""{0}"", ""{1}"", ""{2}"");";
            }
            else
            {
                script = @"window.open(""{0}"", ""{1}"");";
            }

            script = String.Format(script, url, target, windowFeatures);
            ScriptManager.RegisterStartupScript(page, typeof(Page), "Redirect", script, true);
        }
    }
}
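As a usage sketch, a call from a Web Forms code-behind might look like the following (the page class, control and event handler names here are hypothetical, not from the original post):

using System;
using System.Web.UI;

// Hypothetical page code-behind showing how the extension method could be called.
public partial class ReportPage : Page
{
    protected void OpenPopupButton_Click(object sender, EventArgs e)
    {
        // Opens popup.aspx in a new window with the given window features.
        // Processing of the current request continues afterwards, as the
        // remarks in the helper explain.
        Response.Redirect("popup.aspx", "_blank", "menubar=0,width=400,height=300");
    }
}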

    Read the article

  • PECL OCI8 2.0 Production Release Announcement

    - by cj
    The PHP OCI8 2.0.6 extension for Oracle Database is now "production" status. The source code is available on PECL. This can be used immediately to update your OCI8 extension in PHP 5.2 and later versions. The extension compiles with Oracle 10.2 or later client libraries. Oracle's standard cross-version database connectivity applies. OCI8 2.0 and PHP 5.5.5 RPMs for Oracle and Red Hat Linux are available from oss.oracle.com. Windows DLLs are available on PECL for PHP 5.3, PHP 5.4 and PHP 5.5. OCI8 2.0 source code will also be automatically included in the next major version of PHP.

New Functionality
Oracle Database 12c Implicit Result Set support. IRSs make it easy to pass query results back from stored PL/SQL procedures or anonymous PL/SQL blocks. Individual IRS statement resources, each corresponding to a single query, can be obtained with the new function oci_get_implicit_resultset(). These 'child' statement resources can be passed to any oci_fetch_* function. See Using PHP and Oracle Database 12c Implicit Result Sets and the PHP Manual: oci_get_implicit_resultset().
DTrace Dynamic Trace static probes. This well-respected DTrace tracing framework is available on a number of platforms, including Oracle Linux. PHP OCI8 static user-space probes can be enabled with PHP's --enable-dtrace configuration option. See Using PHP DTrace on Oracle Linux. Documentation is also available in the PHP Manual: OCI8 and DTrace Dynamic Tracing.

Improved Functionality
Using oci_execute($s, OCI_NO_AUTO_COMMIT) for a SELECT no longer unnecessarily initiates an internal ROLLBACK during connection close. This can improve overall scalability by reducing "round trips" between PHP and the database.

Changed Functionality
PHP OCI8 2.0's minimum prerequisites are now PHP 5.2 and Oracle client library 10.2. Later versions of both are usable and, in fact, recommended. Use the older PHP OCI8 1.4.10 extension when using PHP 4.3.9 through to PHP 5.1.x, or when only Oracle Database 9.2 client libraries are available. oci_set_*($connection, ...) metadata setting call error handling is fixed so that oci_error($connection) works for these calls. Note: The old, deprecated function aliases like ocilogon still exist but are not recommended for new applications.

Phpinfo() Changes
Some cosmetic changes were made to the output of php --ri oci8 and the phpinfo() function. The oci8.event and oci8.connection_class values are now shown only when the Oracle client libraries support the respective functionality. Connection statistics are now in a separate phpinfo() table. Temporary LOB and Collection support status lines in phpinfo() output were removed. These two features have always been enabled since 2007.

Oci_internal_debug() Changes
The oci_internal_debug() function is now a no-op. Use PHP's --enable-dtrace functionality with DTrace or SystemTap instead.

References
OCI8 Extension source code and Windows DLLs: http://pecl.php.net/package/oci8
Oracle Linux RPMs: oss.oracle.com
PHP Manual for OCI8
OCI8 and DTrace Dynamic Tracing
Oracle OpenWorld Conference paper: What's New in Oracle Database 12c for PHP

    Read the article

  • Future Of F# At Jazoon 2011

    - by Alois Kraus
    I was at Jazoon 2011 in Zurich (Switzerland). It was a really cool event and it had many top-notch speakers, not only from the Microsoft universe. One of the most interesting talks was from Don Syme, with the title "F# Today / F# Tomorrow". He showed how to use F# scripting to browse through open databases, OData web services, SharePoint, … interactively. It looked really easy with the help of F# Type Providers, which is the next big language feature in a future F# version. The object returned by a Type Provider is used to access the data like a usual strongly typed object model. No guessing how the property of an object is called: IntelliSense will show it just as you expect. There exists a range of Type Providers for various data sources where the schema of the stored data can somehow be dynamically extracted. If we use, for example, a free database, it would then be let data = DbProvider(http://.....), with data being the object which contains all data from, e.g., a chemical database. It has an elements collection which contains an element which has the properties Name, AtomicMass, Picture, …. You can browse the object returned by the Type Provider with full IntelliSense, because the returned object is strongly typed, which makes this happen.

    The same can be achieved, of course, with code generators that take as input the schema of the data source (OData web service, database, SharePoint, JSON-serialized data, …) and spit out the necessary strongly typed objects as an assembly. This does work, but has the downside that if the schema of your data source is huge you will quickly run against a wall with traditional code generators, since the generated “deserialization” assembly could easily become several hundred MB.

    *** The following part contains my guesses at how this exactly works, based on asking Don two questions ***

    Q: Can I use Type Providers within C#?
    D: No.
    Q: F# is, after all, a library. Can I reference the F# assemblies and use the contained Type Providers?
    D: F# annotates the generated types in a special way at runtime, which is not a static type that C# could use.

    The F# Type Providers seem to use a hybrid approach. At compilation time the Type Provider is instantiated with the URL of your input data. The obtained schema information is used by the compiler to generate static types as usual, but only for a small subset (the top-level classes up to a certain nesting level would make sense to me). To make this work you need to access the actual data source at compile time, which could be a problem if you want to keep the actual URL in a config file. OK, so this explains why it does work at all. But in the demo we did see full IntelliSense support down to the deepest object level. It looks like, if you navigate deeper into the object hierarchy, the Type Provider is instantiated in the background and attaches to a true static type the properties determined at run time while you were typing. So this type is not really static at all. It is static if you define “static” as meaning that its properties show up in IntelliSense. But since this type information is determined while you are typing, it is not used to generate a true static type, and you cannot use these “intellistatic” types from C#. Nonetheless this is a very cool language feature. With the plotting libraries you can generate expressive charts from any data source within seconds, to get a quick overview of any structured data storage.

    Although my favorite programming language, C#, will not get such features in the near future, there is hope. If you restrict yourself to OData sources, you can use LINQPad to query any OData-enabled data source with LINQ with ease. There you can query Stack Overflow with a LINQ query, and the output is also nicely rendered, which makes it a very good tool to explore OData sources today.
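    To make the LINQ-over-OData idea concrete, here is a rough C# sketch using the WCF Data Services client; the service URL, entity set name, and the Post entity shape are hypothetical placeholders, not the actual Stack Overflow feed schema:

      using System;
      using System.Data.Services.Client;
      using System.Linq;

      // Hypothetical entity shape for illustration only; not the real feed's schema.
      public class Post
      {
          public int Id { get; set; }
          public string Title { get; set; }
          public int Score { get; set; }
      }

      class ODataSketch
      {
          static void Main()
          {
              // Placeholder service root; substitute a real OData endpoint.
              var ctx = new DataServiceContext(new Uri("https://odata.example.com/stackoverflow/"));

              // The LINQ operators below are translated into OData query options ($filter, $orderby, $top).
              var topPosts = ctx.CreateQuery<Post>("Posts")
                                .Where(p => p.Score > 100)
                                .OrderByDescending(p => p.Score)
                                .Take(10);

              foreach (var p in topPosts)
              {
                  Console.WriteLine("{0} ({1})", p.Title, p.Score);
              }
          }
      }

    LINQPad hides the boilerplate above by generating the typed data context for you, which is what makes it so convenient for exploring OData feeds.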

    Read the article

  • Deliberate Practice

    - by Jeff Foster
    It’s easy to assume, as software engineers, that there is little need to “practice” writing code. After all, we write code all day long! Just by writing a little each day, we’re constantly learning and getting better, right? Unfortunately, that’s just not true. Of course, developers do improve with experience. Each time we encounter a problem we’re more likely to avoid it next time. If we’re in a team that deploys software early and often, we hone and improve the deployment process each time we practice it. However, not all practice makes perfect. To develop true expertise requires a particular type of practice, deliberate practice, the only goal of which is to make us better programmers. Everyday software development has other constraints and goals, not least the pressure to deliver. We rarely get the chance in the course of a “sprint” to experiment with potential solutions that are outside our current comfort zone. However, if we believe that software is a craft then it’s our duty to strive continuously to raise the standard of software development. This requires specific and sustained efforts to get better at something we currently can’t do well (from Harvard Business Review July/August 2007).

    One interesting way to introduce deliberate practice, in a sustainable way, is the code kata. The term kata derives from martial arts and refers to a set of movements practiced either solo or in pairs. One of the better-known examples is the Bowling Game kata by Bob Martin, the goal of which is simply to write some code to do the scoring for 10-pin bowling. It sounds too easy, right? What could we possibly learn from such a simple example? Trust me, though, that it’s not as simple as five minutes of typing and a solution. Of course, we can reach a solution in a short time, but the important thing about code katas is that we explore each technique fully and in a controlled way. We tackle the same problem multiple times, using different techniques and making different decisions, understanding the ramifications of each one, and exploring edge cases. The short feedback loop optimizes opportunities to learn.

    Another good example is Conway’s Game of Life. It’s a simple problem to solve, but try solving it in a functional style. If you’re used to mutability, solving the problem without mutating state will push you outside of your comfort zone. Similarly, if you try to solve it with the focus of “tell-don’t-ask“, how will the responsibilities of each object change?

    As software engineers, we don’t get enough opportunities to explore new ideas. In the middle of a development cycle, we can’t suddenly start experimenting on the team’s code base. Code katas offer an opportunity to explore new techniques in a safe environment. If you’re still skeptical, my challenge to you is simply to try it out. Convince a willing colleague to pair with you and work through a kata or two. It only takes an hour and I’m willing to bet you learn a few new things each time. The next step is to make it a sustainable team practice. Start with an hour every Friday afternoon (after all, who wants to commit code to production just before they leave for the weekend?) for a month and see how that works out. Finally, consider signing up for the Global Day of Code Retreat. It’s like a daylong code kata; it’s on December 8th and there’s probably an event in your area!
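    Returning to the Bowling Game kata mentioned above, a very condensed C# sketch of one possible scoring solution might look like the following. This is just one shape the code can take, not the post author's code or the canonical version; the point of the kata is to rebuild it test-first and compare the decisions you make each time:

      // One possible solution shape for the Bowling Game kata (assumed, for illustration only).
      public class Game
      {
          private readonly int[] rolls = new int[21]; // enough for a full game including bonus rolls
          private int currentRoll;

          public void Roll(int pins)
          {
              rolls[currentRoll++] = pins;
          }

          public int Score()
          {
              int score = 0;
              int frameIndex = 0;

              for (int frame = 0; frame < 10; frame++)
              {
                  if (rolls[frameIndex] == 10)                              // strike: 10 on the first roll
                  {
                      score += 10 + rolls[frameIndex + 1] + rolls[frameIndex + 2];
                      frameIndex += 1;
                  }
                  else if (rolls[frameIndex] + rolls[frameIndex + 1] == 10) // spare: 10 across both rolls
                  {
                      score += 10 + rolls[frameIndex + 2];
                      frameIndex += 2;
                  }
                  else
                  {
                      score += rolls[frameIndex] + rolls[frameIndex + 1];
                      frameIndex += 2;
                  }
              }

              return score;
          }
      }

    Create a Game per player, call Roll for every throw, and call Score at the end; then try re-deriving the same behaviour without mutable state, or with a tell-don't-ask design, and notice how the responsibilities shift.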

    Read the article

  • DBA Best Practices - A Blog Series: Episode 2 - Password Lists

    - by Argenis
  Digital World, Digital Locks

    One of the biggest digital assets that any company has is its secrets. These include passwords, key rings, certificates, and any other digital asset used to protect another asset from tampering or unauthorized access. As a DBA, you are very likely to manage some of these assets for your company - and your employer trusts you with keeping them safe. Probably one of the most important of these assets are passwords. As you well know, they can be used anywhere: for service accounts, credentials, proxies, linked servers, DTS/SSIS packages, symmetrical keys, private keys, etc., etc. Have you given some thought to what you're doing to keep these passwords safe? Are you backing them up somewhere? Who else besides you can access them?

    Good-Ol’ Post-It Notes Under Your Keyboard

    If you have a password-protected Excel sheet for your passwords, I have bad news for you: Excel's level of encryption is good for your grandma's budget spreadsheet, not for a list of enterprise passwords. I will try to summarize the main point of this best practice in one sentence: You should keep your passwords on an encrypted, access and version-controlled, backed-up, well-known shared location that every DBA on your team is aware of, and maintain copies of this password "database" on your DBAs' workstations. Now I have to break down that statement for you:

    - Encrypted: what’s the point of saving your passwords in a file that any Windows admin with enough privileges can read?
    - Access controlled: this one is pretty much self-explanatory.
    - Version controlled: passwords change (and I’m really hoping you do change them) and version control would allow you to track what a previous password was if the utility you’ve chosen doesn’t handle that for you.
    - Backed-up: you want a safe copy of the password list to be kept offline, preferably in long-term storage, with relative ease of restoring.
    - Well-known shared location: this is critical for teams: what good is a password list if only one person in the team knows where it is?

    I have seen multiple examples of this that work well. They all start with an encrypted database. Certainly you could leverage SQL Server's native encryption solutions like cell encryption for this. I have found such implementations to be impractical, for the most part.

    Enter The World Of Utilities

    There are a myriad of open source/free software solutions to help you here. One of my favorites is KeePass, which creates encrypted files that can be saved to a network share, SharePoint, etc. KeePass has UIs for most operating systems, including Windows, MacOS, iOS, Android and Windows Phone. Other solutions I've used before worth mentioning include PasswordSafe and 1Password, with the latter one being a paid solution – but wildly popular on mobile devices. There are, of course, even more "enterprise-level" solutions available from 3rd party vendors. The truth is that most of the customers that I work with don't need that level of protection of their digital assets, and something like a KeePass database on SharePoint suits them very well.

    What are you doing to safeguard your passwords? Leave a comment below, and join the discussion! Cheers, -Argenis

    Read the article

  • Perfect End to a Bad Day

    - by TehGrumpyCoder
    Yesterday's post about A Bad Day at Work actually had an addendum to it. There were apparently a bunch of guys on ice skates last night competing in some sport way the hell and gone over on the other side of the valley, and enough people couldn't live without seeing them that they had all major arteries heading west honked. I mean honked... the traffic guy reported the 101 had 16 miles of backup... yikes. Since I worked downtown for a number of years, my fallback is to cut across the city on surface streets to get to one of my old 'haunts' and just drive it home from there. Of course with the 101 backed up, then I-17 would logically be as well, so I kept the news on rather than my Zune and heard where the bad stuff was going North. I popped out on the freeway about 7 miles south of my exit. Got to the exit which is about a mile from the house without killing or maiming me or anyone else. Waited patiently at the light in the inside lane to make a left and go under the freeway proceeding West. The light changed, I had full green, I started through and whoa... I've got someone in a little rat car crossing my bow!

    A little explanation... I drive a 3/4 ton pickup with a V-10, extended cab and shell on the back. It's not jacked up, but it sits up pretty good and is longer than any parking place I've ever tried to put it into. I consider this truck to be the consolation prize for paying uninsured motorist coverage for 45 years and having Pilar Martinez totally destroy a 3/4 ton Silverado on March 1, 2007 by plowing into me at traffic speed while I was stopped at a light. If you pay for uninsured motorist coverage, ask your insurance agent *exactly* what that means... I bet it's different than what you think it means. But I digress, sorry...

    So here I am with a car that is shorter from top to road than the hood on my truck, and the driver thought it would be safe to run a red light and see if they could get past me before I got into the lane. The right side of my front bumper was almost into the driver's window when I hit the brakes and wheeled it left. Fortunately for all involved, I saw it soon enough, and pulled into the 2nd lane for making a left to go back South. I looked in my mirror, signalled a move, then moved over behind the yuck in the rat car. I then punched it, and the future hood ornament and I both made it through the next light. I pulled alongside to let her know that she was DEFINITELY Number 1 in my book, and it's a middle-age woman looking at me with a "sorry, it was an accident" show of pouty face and arms held up. Tough $hit lady... that may have worked when you were 18, but it's not working anymore, and it wasn't an accident... you ran a freakin' red light and almost got yourself killed.

    That just about put a bow on the day... I was home later than usual, pissed off about work stuff, pissed off at traffic, and now that. I ate dinner, watched a little TV, and was asleep about 9:30, exhausted. Hope today is better.

    Read the article

  • GMLib Could not complete the operation due to error 80020101

    - by Pierrie
    I get the error "Could not complete the operation due to error 80020101." at random times when displaying a map with a marker on it. I use Delphi 2007 and GMLib [1.2.0 Final].

    I have read up on the issue, and some suggestions were that the problem is due to commenting or bad syntax in the JavaScript code; it was suggested that I take out all the commenting and check for errors in the JavaScript. This I did: I recompiled and reinstalled GMLib after modifying the map.html file. I stripped it of all commenting and parsed it through IE for faults, but found none, as expected. The problem still occurs.

    Here is a sample of my code to show the map and add the marker:

      var
        newmarker: TMarker;
      begin
        newmarker := GMMarker1.Add();
        newmarker.Position.Lat := MarkersToPaint[i].Latitude;
        newmarker.Position.Lng := MarkersToPaint[i].Longitude;
        newmarker.Visible := True;
        newmarker.Title := MarkersToPaint[i].Title;

        GMMap1.RequiredProp.Center.Lat := midlat;
        GMMap1.RequiredProp.Center.Lng := midlong;
        GMMap1.RequiredProp.Zoom := 18;

        GMMarker1.ShowElements;
        GMMap1.Active := True;

    Any help in this matter will be greatly appreciated.

    Read the article

  • T-SQL: Compute Subtotals For A Range Of Rows

    - by John Dibling
    MSSQL 2008. I am trying to construct a SQL statement which returns the total of column B for all rows where column A is between 2 known ranges. The range is a sliding window, and should be recomputed as it might be using a loop. Here is an example of what I'm trying to do, much simplified from my actual problem.

    Suppose I have this data in table Test:

      Year        Sales
      ----------- -----------
      2000        200
      2001        200
      2002        200
      2003        200
      2004        200
      2005        200
      2006        200
      2007        200
      2008        200
      2009        200
      2010        200
      2011        200
      2012        200
      2013        200
      2014        200
      2015        200
      2016        200
      2017        200
      2018        200
      2019        200

    I want to construct a query which returns 1 row for every decade in the above table, like this:

      Desired Results:

      DecadeEnd   TotalSales
      ---------   ----------
      2009        2000
      2010        2000

    Where the first row is all the sales for the years 2000-2009, the second for years 2010-2019. The DecadeEnd is a sliding window that moves forward by a set amount for each row in the result set. To illustrate, here is one way I can accomplish this using a loop:

      declare @startYear int
      set @startYear = (select top(1) [Year] from Test order by [Year] asc)

      declare @endYear int
      set @endYear = (select top(1) [Year] from Test order by [Year] desc)

      select @startYear, @endYear

      create table DecadeSummary (DecadeEnd int, TtlSales int)

      declare @i int
      -- first decade ends 9 years after the first data point
      set @i = (@startYear + 9)

      while @i <= @endYear
      begin
        declare @ttlSalesThisDecade int
        set @ttlSalesThisDecade = (select SUM(Sales) from Test where (Year <= @i and Year >= (@i - 9)))
        insert into DecadeSummary values (@i, @ttlSalesThisDecade)
        set @i = (@i + 9)
      end

      select * from DecadeSummary

    This returns the data I want:

      DecadeEnd   TtlSales
      ----------- -----------
      2009        2000
      2018        2000

    But it is very inefficient. How can I construct such a query?

    Read the article

  • Visual Studio 2008 SignTool.exe not found

    - by Maslow
    I can't publish in 2008; I was previously using 2005 and it published just fine. The error is:

      Error 2: An error occurred while signing: SignTool.exe not found.

    I know there are tons of hits for a search on signtool.exe on Google. The ones I've found involve copying the file to X, Y, Z locations and ensuring signtool matches up with your VS command prompt path. When I run Start - Program Files - Visual Studio 2008 - Visual Studio Tools - Visual Studio Command Prompt and type signtool.exe, it finds the file just fine.

    I have Visual Studio 2005 Professional Edition, Visual Studio 2008 Professional Edition, and the Visual Studio 2005 SDK February 2007; I just installed the Visual Studio 2008 SDK 1.1 to see if that would fix it, but no luck. I have copied signtool.exe to lots of places that were suggested in the Google searches; it is now located at all of the following:

      C:\Program Files\Visual Studio 2005 SDK\2007.02
      C:\Program Files\Microsoft Visual Studio 9.0\Common7\Tools
      C:\Program Files\Microsoft Visual Studio 9.0\VB\Bin
      C:\WINNT\Microsoft.NET\Framework\v3.5
      C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bin
      C:\Program Files\Microsoft Visual Studio 8\Common7\Tools\Bin
      C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin
      C:\Program Files\Microsoft Visual Studio 9.0\SDK\v3.5\Bin
      C:\Program Files\Microsoft Visual Studio 9.0\VB\Bin\1033
      C:\Program Files\Visual Studio 2005 SDK\2007.02\VisualStudioIntegration\Tools\Bin

    I'm on Windows XP.

    2009-06-12 update: I can only publish if I copy signtool.exe to the folder of the project I'm publishing.

    Read the article

  • Strong Naming an assembly using command line compile

    - by David
    I am trying to use NAnt in order to compile and sign an assembly using the vbc compiler. I have a project set up and am able to successfully sign the assembly when compiling with VS2010. When I try to sign it using the command line I get this error:

      vbc : error BC30140: Error creating assembly manifest: Error signing assembly -- The parameter is incorrect.

    I even created a trivially simple app (just an assemblyinfo.vb file) that will not compile and sign using vbc.exe. What am I doing wrong? Here is my assemblyinfo.vb:

      Option Strict Off
      Option Explicit On

      Imports System
      Imports System.Reflection

      <Assembly: AssemblyVersionAttribute("2010.05.18.0918"), _
       Assembly: AssemblyCopyrightAttribute("Copyright © Patient First 2007"), _
       Assembly: AssemblyCompanyAttribute("Patient First, Inc."), _
       Assembly: AssemblyProductAttribute("Patient First Framework"), _
       Assembly: AssemblyDelaySign(false), _
       Assembly: AssemblyKeyFile("test.pfx"), _
       Assembly: AssemblyTitleAttribute("PatientFirst.Framework")>

    test.pfx is located in the same folder as assemblyinfo.vb. Here is how I am trying to compile it:

      vbc /target:library /verbose assemblyinfo.vb

    I also tried:

      vbc /target:library /verbose assemblyinfo.vb /keyfile:test.pfx

    and tried using the /keyfile parameter without the AssemblyDelaySign and AssemblyKeyFile attributes. If I remove the AssemblyDelaySign and AssemblyKeyFile attributes and leave off the /keyfile command line parameter, it compiles fine. What is the correct way to do this with vbc?

    --EDIT: I have found that MSBuild also does not like having the AssemblyKeyFile attribute as I have defined it in the AssemblyInfo.vb; it gives the same failure message. So the only way I can currently get this to build correctly is to set properties on the project to tell it which key file to use and to sign the assembly.

    Read the article

  • What is the difference between System.Speech.Recognition and Microsoft.Speech.Recognition?

    - by Michael
    There are two similar namespaces and assemblies for speech recognition in .NET. I’m trying to understand the differences and when it is appropriate to use one or the other.

    There is System.Speech.Recognition from the assembly System.Speech (in System.Speech.dll). System.Speech.dll is a core DLL in the .NET Framework class library 3.0 and later.

    There is also Microsoft.Speech.Recognition from the assembly Microsoft.Speech (in microsoft.speech.dll). Microsoft.Speech.dll is part of the UCMA 2.0 SDK.

    I find the docs confusing and I have the following questions:

    - System.Speech.Recognition says it is for "The Windows Desktop Speech Technology"; does this mean it cannot be used on a server OS, or cannot be used for high-scale applications?
    - The UCMA 2.0 Speech SDK ( http://msdn.microsoft.com/en-us/library/dd266409%28v=office.13%29.aspx ) says that it requires Microsoft Office Communications Server 2007 R2 as a prerequisite. However, I’ve been told at conferences and meetings that if I do not require OCS features like presence and workflow, I can use the UCMA 2.0 Speech API without OCS. Is this true?
    - If I’m building a simple recognition app for a server application (say I wanted to automatically transcribe voice mails) and I don’t need features of OCS, what are the differences between the two APIs?
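    For reference, and not as an answer to the server-versus-desktop question itself, a minimal sketch of the desktop System.Speech.Recognition API might look like the following; the free-form dictation grammar and console handling are illustrative choices:

      using System;
      using System.Speech.Recognition;

      class RecognitionSketch
      {
          static void Main()
          {
              // In-process desktop recognizer from System.Speech.dll (.NET 3.0 and later).
              using (var recognizer = new SpeechRecognitionEngine())
              {
                  // Free-form dictation; a real transcription app would likely tune grammars and audio input.
                  recognizer.LoadGrammar(new DictationGrammar());
                  recognizer.SetInputToDefaultAudioDevice();

                  recognizer.SpeechRecognized += (sender, e) =>
                      Console.WriteLine("Heard: {0} (confidence {1:F2})", e.Result.Text, e.Result.Confidence);

                  recognizer.RecognizeAsync(RecognizeMode.Multiple);
                  Console.WriteLine("Listening... press Enter to stop.");
                  Console.ReadLine();
              }
          }
      }

    The Microsoft.Speech.Recognition namespace exposes a very similar surface, though the supported grammars and intended deployment scenarios differ between the desktop and server engines.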

    Read the article

< Previous Page | 229 230 231 232 233 234 235 236 237 238 239 240  | Next Page >