Search Results

Search found 6185 results on 248 pages for 'aspx'.


  • XBRL US Conference Highlights

    - by john.orourke(at)oracle.com
    Back in early November I had an opportunity to attend the XBRL US National Conference in Philadelphia. At the event, XBRL US announced that Oracle had joined the initiative, so I had a chance to participate in a press conference and attend a number of sessions. Oracle joined XBRL US so we can stay ahead of the standard and leverage it in our products, and to help drive awareness with customers and improve adoption of XBRL. There were roughly 250 attendees at the event, about half of whom were vendors and consultants and the rest financial reporting staff from corporate filers. Event sponsors included Ernst & Young, SWIFT and Fujitsu. There were also a number of XBRL technology and service providers exhibiting at the conference.

    On Monday, Nov. 8th, the XBRL US Steering Committee meetings and the Annual Members meeting and reception were held. At the Annual Members meeting the big news was that the current XBRL US President, Mark Bolgiano, is moving to a new position at Howard Hughes Medical Center. Campbell Pryde, who had led taxonomy development for XBRL US, is taking over as XBRL US President. Other items highlighted at the members meeting included:
    - The US GAAP XBRL taxonomy is being used by over 1500 SEC filers and has now been handed over to the FASB to maintain and enhance
    - 16 filer training events were held in 2010
    - XBRL Global Magazine was launched
    - A Corporate Actions proposal was submitted to the SEC with SWIFT in May
    - XBRL Labs for iPhone and the XBRL US Consistency Suite were launched
    - ISO 20022 Corporate Actions alignment with XBRL was achieved
    - The XBRL Credit Rating taxonomy was accepted

    Tuesday, Nov. 9th included keynotes, general sessions, an Innovation Workshop for Governments and Securities Professionals, and an opening reception. General sessions included:
    - Lessons Learned from the SEC's rollout of XBRL. More than 18,000 errors were identified in reviews of filings between June 2009 and September 2010. Most of these related to negative values being used where they shouldn't have been. The SEC also feels there are too many taxonomy extensions being created - mostly in the Cash Flow Statements. They emphasize using existing elements in the US GAAP taxonomy and advise filers not to create extensions to improve the visual formatting of XBRL filings.
    - Investors and XBRL - Setting the Standard for Data Quality. In this panel discussion, the key learning was that CFAs, academics and the financial community are not using XBRL as expected. The issues raised included the accuracy and completeness of filings, the number of taxonomy extensions, and the limited number of tools available to help analyze XBRL data. Another big issue raised was the lack of historic results in XBRL - most analysts need 10 quarters of historic data. On the positive side, XBRL has the potential to eliminate re-keying of data and the errors that come with it, and it can improve analytic capabilities for financial analysts once more historic data is available and more companies are providing detailed tagging of their filings.
    - A US Roadmap for XBRL Financial Reporting. This was a panel discussion featuring Jeff Neumann (SEC), Campbell Pryde (XBRL US), and Louis Matherne (FASB). Key points included the fact that XBRL is currently used by 1500 companies, with 8000 more companies coming in 2011. XBRL for Mutual Fund Reporting will start in 2011 for 8000 funds, and a Credit Rating Taxonomy has now been submitted for review. The XBRL tagging/filing process is improving each quarter - more education is helping here. The FASB is looking at extensions to date, and potential additions to the US GAAP taxonomy, while the SEC is evaluating filings for accuracy, consistency in tagging, and tools for analyzing data. The big news is that the FASB 2011 US GAAP Taxonomy has been completed and reviewed by the SEC. The 2011 US GAAP Taxonomy supports new FASB accounting standards issued since 2009, has new taxonomy elements for certain industries (e.g. airlines), and eliminates 500 concepts (meaning they can't be used going forward but are still supported for historical comparison). The 2011 US GAAP Taxonomy will be available for use with Q2 2011 SEC filings. More information can be found on the FASB web site: http://www.fasb.org/home
    - Accounting Firms and XBRL. This session covered the role of audit firms, which includes awareness and education, validation of XBRL filings, and in-house transition planning. The main advice was that organizations should document the XBRL mapping process, perform peer comparisons, and run risk assessments on a regular basis.

    Wednesday, Nov. 10th included more keynotes, general sessions on Corporate Actions, and XBRL Essentials workshop training for corporate filers. The XBRL Essentials training included:
    - Getting Started
    - Once You Have the Basics
    - Detailed Footnote Tagging and Handling Tables
    - Quality Control and Trust in the XBRL Process
    - Bringing XBRL In-House: What Are the Options, What Should You Consider?
    - The US GAAP Financial Reporting Taxonomy - Overview of the 2011 Release

    The XBRL Essentials training was well attended, with about 80 people. It included a good overview of the SEC's XBRL mandate, the limited liability issue, tagging levels, a recommended planning process, internal vs. outsourced approaches, and how to manage service providers. I learned a lot from the session on detailed tagging. This is the requirement that kicks in during a company's second year of XBRL filing with the SEC and applies to financial statements, footnotes and disclosures (it does not apply to MD&A, executive communications and other information). The review of the Linkbase model, or dimensional table structure, was very interesting and can be complex to understand. The key takeaway here is that using dimensional tables in XBRL filings can help limit the number of taxonomy extensions that are required. The slides from this session are posted on the XBRL US web site (http://xbrl.us/events/Pages/archive.aspx).

    For me, the main summary points and takeaways from the XBRL US conference are:
    - XBRL for financial reporting has turned the corner and gone mainstream - with 1500 companies currently using it and 8000 more coming in 2011
    - The expected value is not yet being achieved by filers or consumers of XBRL data - this will improve when more companies are filing in XBRL, more history is available, and more software tools are available for analysis (hmm, sounds like an opportunity for Oracle)
    - XBRL is becoming the global standard for all business communications beyond just the financials - e.g. adoption for mutual funds, corporate actions and others planned for the future

    If you would like to learn more about XBRL and the various training programs, services and software tools that are available, check out the XBRL US web site and, even better, become a member. Here's a link: http://xbrl.us/Pages/default.aspx


  • CodePlex Daily Summary for Sunday, March 28, 2010

    CodePlex Daily Summary for Sunday, March 28, 2010

    New Projects
    - Feed Tracker: Feed Tracker allows you to track your favorite feeds (RSS 2.0 and Atom 1.0) and open them up directly in your browser.
    - FIM 2010 Resource Management Client: The Forefront Identity Management 2010 Resource Management Client is a library to communicate with the FIM 2010 web service. The development langu...
    - Infection Protection: A game about controlling disease outbreak in a city. Developed for OGPC 2010, using Qt.
    - OrthoLab: Homepage of Orthocone open-source laboratory.
    - Paragliding ThermalMarker: Paragliding / Hanggliding Windows Application that receives waypoint files and returns only the thermals that get triggered more often in a place.
    - RSSFalls: RssFalls makes it easier for developers to download RSS or Podcast enclosures.
    - String Library for C++ Language: StrLib++ is a string library for C++ language. for now it support only ANSI strings, later Unicode support will added for UT8, UTF16 and UTF32 for...
    - Sweeper: Sweeper is a Visual Studio 2008 add-in for C# that takes care of many of the trivial code-formatting issues that developers run into - particularly...
    - System.Common: A .Net library that provides methods, properties and more that the .Net Framework doesn't provide.
    - Tiveriad: The framework is designed to help you more easily build modular Windows application
    - T-Shirts Online: Online shop build in Silverlight 4 using DIBS as payment module.

    New Releases
    - ArkSwitch: ArkSwitch v1.1.3: This release has some important changes. Thanks to MichyPrima for helping with some of the code. 1. Improved theming to more easily support multip...
    - Catharsis: Catharsis 2.5 on catarsa.com: The Catharsis framework has finally its own portal http://catarsa.com The latest release version is 2.5 - string names of properties are not any ...
    - EffiProz - A Pure C# Database: EffiProz CF 1.0: EffiProz for .Net compact framework.
    - Encrypted Notes: Encrypted Notes 1.6: This is the latest version of Encrypted Notes (1.6). It has an installer - it will create a directory 'CPascoe' in My Documents. Once you have ext...
    - Extend SmallBasic: Teaching Extensions v.009: Added Pentagon Crazy Recipe Quiz
    - Gapi.NET - .NET (C#) wrapper for Google API: Gapi.NET 0.5.0.0: - Fixed some minor bugs. - Add minor features. - Performance improvement. See code check-ins for detailed information
    - HouseFly experimental controls: HouseFly experimental control: Alpha version of HouseFly experimental controls
    - iTuner - The iTunes Companion: iTuner 1.2.3738 Beta 2: V1.2 allows you to synchronize one or more iTunes playlists to a USB MP3 player. Beta 2 resolves all known issues. This continues the evolution ye...
    - jQuery Library for SharePoint Web Services: SPServices 0.5.4: IMPORTANT NOTE: This release is in an alpha state. You should only download it if you know what you are getting and are interested in testing it f...
    - JSINQ - LINQ to Objects for JavaScript: JSINQ 1.0: This is the first stable release of JSINQ. It is fully compatible with the 0.9 beta release. It contains the following new features: Now supports ...
    - MSBuild Mercurial Tasks: 1.0.0 Beta: First release of the application. This version integrates all the basic functionalities of Mercurial as defined in the Use Case 1.
    - Open Portal Foundation: Open Portal Foundation V1.4.2: What's news? noscript template was updated naming convention for layout autogenerated controls now use the "master" prefix. The documentation ce...
    - Open Portal Foundation: Open Portal Foundation V1.4.4: What's news? ASP .NET Master page support for custom aspx integrated pages New usercontrols for ASCX, ASPX and Master page integration : Link: f...
    - Paint.NET PSD Plugin: 1.5.0: RLE compression is now working fully on save. File sizes are now competitive with Photoshop's. Saving takes about twice as long with RLE compressi...
    - Paragliding ThermalMarker: ThermalMarker_Alfa0.1: Release Alfa 1
    - Simple Service Locator: Simple Service Locator v0.7: The Simple Service Locator is an easy-to-use Inversion of Control library that is a complete implementation of the Common Service Locator interface...
    - String Library for C++ Language: Release 0.9: version 0.9 beta release DO NOT USE IN SERIOUS PROJECTS this release use default application heap, and because visual studio is using special debug...
    - Sweeper: Sweeper Alpha 1: Sweeper, a Visual Studio Add-in for C# Code Formatting (Visual Studio 2008). Includes: A UI for options, Enable or disable any specific task you want ...
    - T-Shirts Online: 1.0: First release of the online shop.
    - Twilio Server Library for .NET (TSL.NET): v0.1.0 Beta: This is the first release of TSL.NET. This v0.1.0 release is a Beta. Subsequent builds will be posted as v0.1.x and release-candidate Betas will be...
    - Vr30 OS: Facebook 1.0: Connect you to Facebook without your web browser.
    - Vr30 OS: SkyBlog 1.0: SkyBlog without web browser.
    - Vr30 OS: YouTube 1.0: Youtube without web browser
    - Web Image Resize Handler: Web Image Resize, Zoom, Rotate and Greyscale v.1.0: Efficient Web Image Resize, Zoom, Rotate and Greyscale cacheing handler for ASP.Net.
    - WinXound: WinXound 3.3.0 Beta 1 for Mac OsX: This is the first Beta release for Apple Mac OsX (Universal Binary). DEBUG HELP NEEDED ! Please signal bugs, suggestions or feedback to: stefano_b...

    Most Popular Projects
    - MetaSharp
    - Rawr
    - WBFS Manager
    - ASP.NET Ajax Library
    - Microsoft SQL Server Product Samples: Database
    - Silverlight Toolkit
    - AJAX Control Toolkit
    - LiveUpload to Facebook
    - Windows Presentation Foundation (WPF)
    - ASP.NET

    Most Active Projects
    - Rawr
    - jQuery Library for SharePoint Web Services
    - Managed Extensibility Framework
    - BlogEngine.NET
    - Microsoft Biology Foundation
    - patterns & practices: Composite WPF and Silverlight
    - LINQ to Twitter
    - Farseer Physics Engine
    - Table2Class
    - NB_Store - Free DotNetNuke Ecommerce Catalog Module


  • Prevent your Silverlight XAP file from caching in your browser.

    - by mbcrump
    If you work with Silverlight daily then you have run into this problem. Your XAP file has been cached in your browser and you have to empty your browser cache to resolve it. If your using Google Chrome then you typically do the following: Go to Options –> Clear Browsing History –> Empty the Cache and finally click Clear Browsing data. As you can see, this is a lot of unnecessary steps. It is even worse when you have a customer that says, “I can’t see the new features you just implemented!” and you realize it’s a cached xap problem.  I have been struggling with a way to prevent my XAP file from caching inside of a browser for a while now and decided to implement the following solution. If the Visual Studio Debugger is attached then add a unique query string to the source param to force the XAP file to be refreshed. If the Visual Studio Debugger is not attached then add the source param as Visual Studio generates it. This is also in case I forget to remove the above code in my production environment. I want the ASP.NET code to be inline with my .ASPX page. (I do not want a separate code behind .cs page or .vb page attached to the .aspx page.) Below is an example of the hosting code generated when you create a new Silverlight project. As a quick refresher, the hard coded param name = “source” specifies the location of your XAP file.  <form id="form1" runat="server" style="height:100%"> <div id="silverlightControlHost"> <object data="data:application/x-silverlight-2," type="application/x-silverlight-2" width="100%" height="100%"> <param name="source" value="ClientBin/SilverlightApplication2.xap"/> <param name="onError" value="onSilverlightError" /> <param name="background" value="white" /> <param name="minRuntimeVersion" value="4.0.50826.0" /> <param name="autoUpgrade" value="true" /> <a href="http://go.microsoft.com/fwlink/?LinkID=149156&v=4.0.50826.0" style="text-decoration:none"> <img src="http://go.microsoft.com/fwlink/?LinkId=161376" alt="Get Microsoft Silverlight" style="border-style:none"/> </a> </object><iframe id="_sl_historyFrame" style="visibility:hidden;height:0px;width:0px;border:0px"></iframe></div> </form> We are going to use a little bit of inline ASP.NET to generate the param name = source dynamically to prevent the XAP file from caching. Lets look at the completed solution: <form id="form1" runat="server" style="height:100%"> <div id="silverlightControlHost"> <object data="data:application/x-silverlight-2," type="application/x-silverlight-2" width="100%" height="100%"> <% string strSourceFile = @"ClientBin/SilverlightApplication2.xap"; string param; if (System.Diagnostics.Debugger.IsAttached) //Debugger Attached - Refresh the XAP file. param = "<param name=\"source\" value=\"" + strSourceFile + "?" 
+ DateTime.Now.Ticks + "\" />"; else { //Production Mode param = "<param name=\"source\" value=\"" + strSourceFile + "\" />"; } Response.Write(param); %> <param name="onError" value="onSilverlightError" /> <param name="background" value="white" /> <param name="minRuntimeVersion" value="4.0.50826.0" /> <param name="autoUpgrade" value="true" /> <a href="http://go.microsoft.com/fwlink/?LinkID=149156&v=4.0.50826.0" style="text-decoration:none"> <img src="http://go.microsoft.com/fwlink/?LinkId=161376" alt="Get Microsoft Silverlight" style="border-style:none"/> </a> </object><iframe id="_sl_historyFrame" style="visibility:hidden;height:0px;width:0px;border:0px"></iframe></div> </form> We add the location to our XAP file to strSourceFile and if the debugger is attached then it will append DateTime.Now.Ticks to the XAP file source and force the browser to download the .XAP. If you view the page source of your Silverlight Application then you can verify it worked properly by looking at the param name = “source” tag as shown below. <param name="source" value="ClientBin/SilverlightApplication2.xap?634299001187160148" /> If the debugger is not attached then it will use the standard source tag as shown below. <param name="source" value="ClientBin/SilverlightApplication2.xap"/> At this point you may be asking, How do I prevent my XAP file from being cached on my production app? Well, you have two easy options: 1) I really don’t recommend this approach but you can force the XAP to be refreshed everytime with the following code snippet.  <param name="source" value="ClientBin/SilverlightApplication2.xap?<%=Guid.NewGuid().ToString() %>"/> NOTE: You could also substitute the “Guid.NewGuid().ToString() for anything that create a random field. (I used DateTime.Now.Ticks earlier). 2) Another solution that I like even better involves checking the XAP Creation Date and appending it to the param name = source. This method was described by Lars Holm Jenson. <% string strSourceFile = @"ClientBin/SilverlightApplication2.xap"; string param; if (System.Diagnostics.Debugger.IsAttached) param = "<param name=\"source\" value=\"" + strSourceFile + "\" />"; else { string xappath = HttpContext.Current.Server.MapPath(@"") + @"\" + strSourceFile; DateTime xapCreationDate = System.IO.File.GetLastWriteTime(xappath); param = "<param name=\"source\" value=\"" + strSourceFile + "?ignore=" + xapCreationDate.ToString() + "\" />"; } Response.Write(param); %> As you can see, this problem has been solved. It will work with all web browsers and stubborn proxy servers that are caching your .XAP. If you enjoyed this article then check out my blog for others like this. You may also want to subscribe to my blog or follow me on Twitter.   Subscribe to my feed
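
    As a footnote to the two options above, the same logic can be pulled out of the page into a small helper so the inline block stays short. This is only a sketch that combines the debugger check from the first snippet with Lars Holm Jensen's last-write-time idea; the class and method names (XapSource.WithCacheBuster) are mine, not part of Silverlight or ASP.NET.

```csharp
// App_Code/XapSource.cs - hypothetical helper class.
using System;
using System.IO;
using System.Web;

public static class XapSource
{
    // Returns the value for the <param name="source"> tag. While the debugger is
    // attached it appends DateTime.Now.Ticks so the XAP is always re-downloaded;
    // otherwise it appends the XAP's last write time, which only changes when the
    // file is actually rebuilt.
    public static string WithCacheBuster(string relativeXapPath)
    {
        if (System.Diagnostics.Debugger.IsAttached)
            return relativeXapPath + "?" + DateTime.Now.Ticks;

        string physicalPath = HttpContext.Current.Server.MapPath("~/" + relativeXapPath);
        DateTime lastWrite = File.GetLastWriteTime(physicalPath);
        return relativeXapPath + "?ignore=" + lastWrite.Ticks;
    }
}
```

    With a helper like this the object tag reduces to a single line along the lines of <param name="source" value="<%= XapSource.WithCacheBuster("ClientBin/SilverlightApplication2.xap") %>" />.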


  • Microsoft Business Intelligence Seminar 2011

    - by DavidWimbush
    I was lucky enough to attend the maiden presentation of this at Microsoft Reading yesterday. It was pretty gripping stuff not only because of what was said but also because of what could only be hinted at. Here's what I took away from the day. (Disclaimer: I'm not a BI guru, just a reasonably experienced BI developer, so I may have misunderstood or misinterpreted a few things. Particularly when so much of the talk was about the vision and subtle hints of what is coming. Please comment if you think I've got anything wrong. I'm also not going to even try to cover Master Data Services as I struggled to imagine how you would actually use it.) I was a bit worried when I learned that the whole day was going to be presented by one guy but Rafal Lukawiecki is a very engaging speaker. He's going to be presenting this about 20 times around the world over the coming months. If you get a chance to hear him speak, I say go for it. No doubt some of the hints will become clearer as Denali gets closer to RTM. Firstly, things are definitely happening in the SQL Server Reporting and BI world. Traditionally IT would build a data warehouse, then cubes on top of that, and then publish them in a structured and controlled way. But, just as with many IT projects in general, by the time it's finished the business has moved on and the system no longer meets their requirements. This not sustainable and something more agile is needed but there has to be some control. Apparently we're going to be hearing the catchphrase 'Balancing agility with control' a lot. More users want more access to more data. Can they define what they want? Of course not, but they'll recognise it when they see it. It's estimated that only 28% of potential BI users have meaningful access to the data they need, so there is a real pent-up demand. The answer looks like: give them some self-service tools so they can experiment and see what works, and then IT can help to support the results. It's estimated that 32% of Excel users are comfortable with its analysis tools such as pivot tables. It's the power user's preferred tool. Why fight it? That's why PowerPivot is an Excel add-in and that's why they released a Data Mining add-in for it as well. It does appear that the strategy is going to be to use Reporting Services (in SharePoint mode), PowerPivot, and possibly something new (smiles and hints but no details) to create reports and explore data. Everything will be published and managed in SharePoint which gives users the ability to mash-up, share and socialise what they've found out. SharePoint also gives IT tools to understand what people are looking at and where to concentrate effort. If PowerPivot report X becomes widely used, it's time to check that it shows what they think it does and perhaps get it a bit more under central control. There was more SharePoint detail that went slightly over my head regarding where Excel Services and Excel Web Application fit in, the differences between them, and the suggestion that it is likely they will one day become one (but not in the immediate future). That basic pattern is set to be expanded upon by further exploiting Vertipaq (the columnar indexing engine that enables PowerPivot to store and process a lot of data fast and in a small memory footprint) to provide scalability 'from the desktop to the data centre', and some yet to be detailed advances in 'frictionless deployment' (part of which is about making the difference between local and the cloud pretty much irrelevant). 
Excel looks like becoming Microsoft's primary BI client. It already has: the ability to consume cubes strong visualisation tools slicers (which are part of Excel not PowerPivot) a data mining add-in PowerPivot A major hurdle for self-service BI is presenting the data in a consumable format. You can't just give users PowerPivot and a server with a copy of the OLTP database(s). Building cubes is labour intensive and doesn't always give the user what they need. This is where the BI Semantic Model (BISM) comes in. I gather it's a layer of metadata you define that can combine multiple data sources (and types of data source) into a clear 'interface' that users can work with. It comes with a new query language called DAX. SSAS cubes are unlikely to go away overnight because, with their pre-calculated results, they are still the most efficient way to work with really big data sets. A few other random titbits that came up: Reporting Services is going to get some good new stuff in Denali. Keep an eye on www.projectbotticelli.com for the slides. You can also view last year's seminar sessions which covered a lot of the same ground as far as the overall strategy is concerned. They plan to add more material as Denali's features are publicly exposed. Check out the PASS keynote address for a showing of Yahoo's SQL BI servers. Apparently they wheeled the rack out on stage still plugged in and running! Check out the Excel 2010 Data Mining Add-Ins. 32 bit only at present but 64 bit is on the way. There are lots of data sets, many of them free, at the Windows Azure Marketplace Data Market (where you can also get ESRI shape files). If you haven't already seen it, have a look at the Silverlight Pivot Viewer (http://weblogs.asp.net/scottgu/archive/2010/06/29/silverlight-pivotviewer-now-available.aspx). The Bing Maps Data Connector is worth a look if you're into spatial stuff (http://www.bing.com/community/site_blogs/b/maps/archive/2010/07/13/data-connector-sql-server-2008-spatial-amp-bing-maps.aspx).  


  • April Oracle Database Events

    - by Mandy Ho
    April 17-18, 2012 – Moscow, Russia
    Oracle Develop Conference
    The Oracle database developer conference, Oracle Develop, will visit Moscow, Russia and Hyderabad, India this spring. Oracle Develop includes a database development track, which contains .NET sessions and a hands-on lab led by an Oracle .NET product manager. Register today before the event fills up.
    http://www.oracle.com/javaone/ru-en/index.html

    April 24 and 26, 2012 – San Diego, CA & San Jose, CA
    (ISC)2 Security Leadership Regional Event Series
    Oracle at the (ISC)2 Security Leadership Series: Herding Clouds - Managing Cloud Security Concerns and Expectations. Join us for this interactive, day-long session where industry leaders, including Oracle solution experts, will talk about how to:
    - dispel the top Cloud security myths
    - minimize "rogue Cloud" implementations
    - identify potential compliance pitfalls and how to avoid them
    - develop contract language for your cloud providers
    - manage users across your fractured datacenter
    - leverage existing technologies to protect data as it moves from the enterprise to the cloud
    http://www.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=146972&src=7239493&src=7239493&Act=228

    April 22-26, 2012 – Las Vegas, NV
    IOUG COLLABORATE 12
    From April 22-26, 2012, Oracle takes Las Vegas. Thousands of Oracle professionals will descend upon the Mandalay Bay Convention Center for a week's worth of education sessions, networking opportunities and more, at the only user-driven and user-run Oracle conference - COLLABORATE 12. Your COLLABORATE 12 - IOUG Forum registration comes complete with a bonus - a full-day Deep Dive education program on Sunday! Choose from numerous hot topics, including Real World Performance, High Availability and more, presented by legendary and seasoned Oracle pros, including Tom Kyte and Craig Shallahamer.
    http://www.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=143637&src=7360364&src=7360364&Act=5

    Independent Oracle User Group (IOUG) Regional Events:

    April 16, 2012 – Columbus, Ohio
    Ohio Oracle Users Group: Higher Performance PL/SQL and Oracle Database 11g PL/SQL New Features – featuring Steven Feuerstein
    http://www.ooug.org/

    April 26, 2012 – Irving, TX
    Dallas Oracle Users Group: Oracle Database Forum, 5-7 PM, Oracle Corporation, 6031 Connection Drive, Irving, TX
    http://memberservices.membee.com/538/irmevents.aspx?id=64

    April 30, 2012 – Los Angeles, CA
    Real World Performance Tour
    A full day of real world database performance with Tom Kyte, author of the ever-popular AskTom blog, Andrew Holdsworth, head of Oracle's Real World Performance Team, and Graham Wood, renowned Oracle Database performance architect. Through discussion, debate and demos, they'll show you how to master performance engineering topics like:
    - Best practices for designing hardware architectures and how to spot and fix bad design.
    - How to develop applications that deliver the fastest possible performance without sacrificing accuracy.
    - New for 2012! Updates on Enterprise Manager, Exadata, and what these technologies mean to your current systems.
    http://www.ioug.org/Events/ADayofRealWorldPerformance/tabid/194/Default.aspx


  • Sorting the columns of an HTML table using JQuery

    - by nikolaosk
    In this post I will show you how easy it is to sort the columns of an HTML table. I will use an external library, called Tablesorter, which makes life so much easier for developers. There are other posts in my blog regarding jQuery; you can find them all here. You can find another post regarding HTML tables and jQuery here. We will demonstrate this with a step-by-step example. I will use Visual Studio 2012 Ultimate. You can also use Visual Studio 2012 Express Edition, or the VS 2010 editions.

    1) Launch Visual Studio. Create an ASP.NET Empty Web Application and choose an appropriate name for it.

    2) Add a web form, a default.aspx page, to the application.

    3) Add a table from the HTML controls tab (in the Toolbox) to the default.aspx page.

    4) Now we need to download the jQuery library. Please visit http://jquery.com/ and download the minified version. Then we need to download the Tablesorter jQuery plugin; you can download it here.

    5) We need to reference the jQuery library and the external jQuery plugin. In the head section, add the following lines:

        <script src="jquery-1_8_2_min.js" type="text/javascript"></script>
        <script src="jquery.tablesorter.js" type="text/javascript"></script>

    6) We need to type the HTML markup - the HTML table and its columns:

        <body>
            <form id="form1" runat="server">
            <div>
                <h1>Liverpool Legends</h1>
                <table style="width: 50%;" border="1" cellpadding="10" cellspacing="10" class="liverpool">
                    <thead>
                        <tr><th>Defenders</th><th>MidFielders</th><th>Strikers</th></tr>
                    </thead>
                    <tbody>
                        <tr>
                            <td>Alan Hansen</td>
                            <td>Graeme Souness</td>
                            <td>Ian Rush</td>
                        </tr>
                        <tr>
                            <td>Alan Kennedy</td>
                            <td>Steven Gerrard</td>
                            <td>Michael Owen</td>
                        </tr>
                        <tr>
                            <td>Jamie Carragher</td>
                            <td>Kenny Dalglish</td>
                            <td>Robbie Fowler</td>
                        </tr>
                        <tr>
                            <td>Rob Jones</td>
                            <td>Xabi Alonso</td>
                            <td>Dirk Kuyt</td>
                        </tr>
                    </tbody>
                </table>
            </div>
            </form>
        </body>

    7) Inside the head section we also write the simple jQuery code:

        <script type="text/javascript">
            $(document).ready(function() {
                $('.liverpool').tablesorter();
            });
        </script>

    8) Run your application. This is how the HTML table looks before it is sorted on the basis of the selected column.

    9) Now I will click on the MidFielders header. Have a look at the picture below.

    Tablesorter is an excellent jQuery plugin that makes sorting HTML tables a piece of cake. Hope it helps!!!


  • Free Document/Content Management System Using SharePoint 2010

    - by KunaalKapoor
    That’s right, it’s true. You can use the free version of SharePoint 2010 to meet your document and content management needs and even run your public facing website or an internal knowledge bank.  SharePoint Foundation 2010 is free. It may not have all the features that you get in the enterprise license but it still has enough to cater to your needs to build a document management system and replace age old file shares or folders. I’ve built a dozen content management sites for internal and public use exploiting SharePoint. There are hundreds of web content management systems out there (see CMS Matrix).  On one hand we have commercial platforms like SharePoint, SiteCore, and Ektron etc. which are the most frequently used and on the other hand there are free options like WordPress, Drupal, Joomla, and Plone etc. which are pretty common popular as well. But I would be very surprised if anyone was able to find a single CMS platform that is all things to all people. Infact not a lot of people consider SharePoint’s free version under the free CMS side but its high time organizations benefit from this. Through this blog post I wanted to present SharePoint Foundation as an option for running a FREE CMS platform. Even if you knew that there is a free version of SharePoint, what most people don’t realize is that SharePoint Foundation is a great option for running web sites of all kinds – not just team sites. It is a great option for many reasons, but in reality it is supported by Microsoft, and above all it is FREE (yay!), and it is extremely easy to get started.  From a functionality perspective – it’s hard to beat SharePoint. Even the free version, SharePoint Foundation, offers simple data connectivity (through BCS), cross browser support, accessibility, support for Office Web Apps, blogs, wikis, templates, document support, health analyzer, support for presence, and MUCH more.I often get asked: “Can I use SharePoint 2010 as a document management system?” The answer really depends on ·          What are your specific requirements? ·          What systems you currently have in place for managing documents. ·          And of course how much money you have J Benefits? Not many large organizations have benefited from SharePoint yet. For some it has been an IT project to see what they can achieve with it, for others it has been used as a collaborative platform or in many cases an extended intranet. SharePoint 2010 has changed the game slightly as the improvements that Microsoft have made have been noted by organizations, and we are seeing a lot of companies starting to build specific business applications using SharePoint as the basis, and nearly every business process will require documents at some stage. If you require a document management system and have SharePoint in place then it can be a relatively straight forward decision to use SharePoint, as long as you have reviewed the considerations just discussed. The collaborative nature of SharePoint 2010 is also a massive advantage, as specific departmental or project sites can be created quickly and easily that allow workers to interact in a variety of different ways using one source of information.  This also benefits an organization with regards to how they manage the knowledge that they have, as if all of their information is in one source then it is naturally easier to search and manage. Is SharePoint right for your organization? 
As just discussed, this can only be determined after defining your requirements and also planning a longer term strategy for how you will manage your documents and information. A key factor to look at is how the users would interact with the system and how much value would it get for your organization. The amount of data and documents that organizations are creating is increasing rapidly each year. Therefore the ability to archive this information, whilst keeping the ability to know what you have and where it is, is vital to any organizations management of their information life cycle. SharePoint is best used for the initial life of business documents where they need to be referenced and accessed after time. It is often beneficial to archive these to overcome for storage and performance issues. FREE CMS – SharePoint, Really? In order to show some of the completely of what comes with this free version of SharePoint 2010, I thought it would make sense to use Wikipedia (since every one trusts it as a credible source). Wikipedia shows that a web content management system typically has the following components: Document Management:   -       CMS software may provide a means of managing the life cycle of a document from initial creation time, through revisions, publication, archive, and document destruction. SharePoint is king when it comes to document management.  Version history, exclusive check-out, security, publication, workflow, and so much more.  Content Virtualization:   -       CMS software may provide a means of allowing each user to work within a virtual copy of the entire Web site, document set, and/or code base. This enables changes to multiple interdependent resources to be viewed and/or executed in-context prior to submission. Through the use of versioning, each content manager can preview, publish, and roll-back content of pages, wiki entries, blog posts, documents, or any other type of content stored in SharePoint.  The idea of each user having an entire copy of the website virtualized is a bit odd to me – not sure why anyone would need that for anything but the simplest of websites. Automated Templates:   -       Create standard output templates that can be automatically applied to new and existing content, allowing the appearance of all content to be changed from one central place. Through the use of Master Pages and Themes, SharePoint provides the ability to change the entire look and feel of site.  Of course, the older brother version of SharePoint – SharePoint Server 2010 – also introduces the concept of Page Layouts which allows page template level customization and even switching the layout of an individual page using different page templates.  I think many organizations really think they want this but rarely end up using this bit of functionality.  Easy Edits:   -       Once content is separated from the visual presentation of a site, it usually becomes much easier and quicker to edit and manipulate. Most WCMS software includes WYSIWYG editing tools allowing non-technical individuals to create and edit content. This is probably easier described with a screen cap of a vanilla SharePoint Foundation page in edit mode.  Notice the page editing toolbar, the multiple layout options…  It’s actually easier to use than Microsoft Word. Workflow management: -       Workflow is the process of creating cycles of sequential and parallel tasks that must be accomplished in the CMS. 
For example, a content creator can submit a story, but it is not published until the copy editor cleans it up and the editor-in-chief approves it. Workflow, it’s in there. In fact, the same workflow engine is running under SharePoint Foundation that is running under the other versions of SharePoint.  The primary difference is that with SharePoint Foundation – you need to configure the workflows yourself.   Web Standards: -       Active WCMS software usually receives regular updates that include new feature sets and keep the system up to current web standards. SharePoint is in the fourth major iteration under Microsoft with the 2010 release.  In addition to the innovation that Microsoft continuously adds, you have the entire global ecosystem available. Scalable Expansion:   -       Available in most modern WCMSs is the ability to expand a single implementation (one installation on one server) across multiple domains. SharePoint Foundation can run multiple sites using multiple URLs on a single server install.  Even more powerful, SharePoint Foundation is scalable and can be part of a multi-server farm to ensure that it will handle any amount of traffic that can be thrown at it. Delegation & Security:  -       Some CMS software allows for various user groups to have limited privileges over specific content on the website, spreading out the responsibility of content management. SharePoint Foundation provides very granular security capabilities. Read @ http://msdn.microsoft.com/en-us/library/ee537811.aspx Content Syndication:  -       CMS software often assists in content distribution by generating RSS and Atom data feeds to other systems. They may also e-mail users when updates are available as part of the workflow process. SharePoint Foundation nails it.  With RSS syndication and email alerts available out of the box, content syndication is already in the platform. Multilingual Support: -       Ability to display content in multiple languages. SharePoint Foundation 2010 supports more than 40 languages. Read More Read more @ http://msdn.microsoft.com/en-us/library/dd776256(v=office.12).aspxYou can download the free version from http://www.microsoft.com/en-us/download/details.aspx?id=5970
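
    To make the document management points above a little more concrete, here is a minimal sketch of driving a SharePoint Foundation document library from the server object model (Microsoft.SharePoint). The site URL, library name and file path are placeholders, and the versioning behaviour shown depends on how the library is configured.

```csharp
using System;
using System.IO;
using Microsoft.SharePoint;   // SharePoint Foundation server object model

class DocumentLibraryDemo
{
    static void Main()
    {
        // Placeholder URL and library name - adjust for your own farm.
        using (SPSite site = new SPSite("http://intranet/sites/docs"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList library = web.Lists["Shared Documents"];
            byte[] content = File.ReadAllBytes(@"C:\temp\policy.docx");

            // Uploading to the same URL with overwrite creates a new version
            // when versioning is enabled on the library.
            SPFile file = library.RootFolder.Files.Add("policy.docx", content, true);

            file.CheckOut();
            file.CheckIn("Annual review", SPCheckinType.MajorCheckIn);

            Console.WriteLine("Stored versions: " + file.Versions.Count);
        }
    }
}
```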


  • Benefits of Behavior Driven Development

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/07/26/benefits-of-behavior-driven-development.aspxContinuing my previous article on BDD, I wanted to point out some benefits of BDD and since BDD is an extension of Test Driven Development (TDD), you get those as well. I’ll add another article on some possible downsides of this approach. There are many articles about the benefits of TDD and they apply to BDD. I’ve pointed out some here and copied some of the main points for each article, but there are many more including the book The Art of Unit Testing by Roy Osherove. http://geekswithblogs.net/leesblog/archive/2008/04/30/the-benefits-of-test-driven-development.aspx (Lee Brandt) Stability Accountability Design Ability Separated Concerns Progress Indicator http://tddftw.com/benefits-of-tdd/ Help maintainers understand the intention behind the code Bring validation and proper data handling concerns to the forefront. Writing the tests first is fun. Better APIs come from writing testable code. TDD will make you a better developer. http://www.slideshare.net/dhelper/benefit-from-unit-testing-in-the-real-world (from Typemock). Take a look at the slides, especially the extra time required for TDD (slide 10) and the next one of the bugs avoided using TDD (slide 11). Less bugs (slide 11) about testing and development (13) Increase confidence in code (14) Fearlessly change your code (14) Document Requirements (14) also see http://visualstudiomagazine.com/articles/2013/06/01/roc-rocks.aspx Discover usability issues early (14) All these points and articles are great and there are many more. The following are my additions to the benefits of BDD from using it in real projects for my company. July 2013 on MSDN - Behavior-Driven Design with SpecFlow Scott Allen did a very informative TDD and MVC module, but to me he is doing BDDCompile and Execute Requirements in Microsoft .NET ~ Video from TechEd 2012 Communication I was working through a complicated task that the decision tree kept growing. After writing out the Given, When, Then of the scenario, I was able tell QA what I had worked through for their initial test cases. They were able to add from there. It is also useful to use this language with other developers, managers, or clients to help make informed decisions on if it meets the requirements or if it can simplified to save time (money). Thinking through solutions, before starting to code This was the biggest benefit to me. I like to jump into coding to figure out the problem. Many times I don't understand my path well enough and have to do some parts over. A past supervisor told me several times during reviews that I need to get better at seeing "the forest for the trees". When I sit down and write out the behavior that I need to implement, I force myself to think things out further and catch scenarios before they get to QA. A co-worker that is new to BDD and we’ve been using it in our new project for the last 6 months, said “It really clarifies things”. It took him awhile to understand it all, but now he’s seeing the value of this approach (yes there are some downsides, but that is a different issue). Developers’ Confidence This is huge for me. With tests in place, my confidence grows that I won’t break code that I’m not directly changing. In the past, I’ve worked on projects with out tests and we would frequently find regression bugs (or worse the users would find them). That isn’t fun. 
We don’t catch all problems with the tests, but when QA catches one, I can write a test to make sure it doesn’t happen again. It’s also good for Releasing code, telling your manager that it’s good to go. As time goes on and the code gets older, how confident are you that checking in code won’t break something somewhere else? Merging code - pre release confidence If you’re merging code a lot, it’s nice to have the tests to help ensure you didn’t merge incorrectly. Interrupted work I had a task that I started and planned out, then was interrupted for a month because of different priorities. When I started it up again, and un-shelved my changes, I had the BDD specs and it helped me remember what I had figured out and what was left to do. It would have much more difficult without the specs and tests. Testing and verifying complicated scenarios Sometimes in the UI there are scenarios that get tricky, because there are a lot of steps involved (click here to open the dialog, enter the information, make sure it’s valid, when I click cancel it should do {x}, when I click ok it should close and do {y}, then do this, etc….). With BDD I can avoid some of the mouse clicking define the scenarios and have them re-run quickly, without using a mouse. UI testing is still needed, but this helps a bunch. The same can be true for tricky server logic. Documentation of Assumptions and Specifications The BDD spec tests (Jasmine or SpecFlow or other tool) also work as documentation and show what the original developer was trying to accomplish. It’s not a different Word document, so developers will keep this up to date, instead of letting it become obsolete. What happens if you leave the project (consulting, new job, etc) with no specs or at the least good comments in the code? Sometimes I think of a new scenario, so I add a failing spec and continue in the same stream of thought (don’t forget it because it was on a piece of paper or in a notepad). Then later I can come back and handle it and have it documented. Jasmine tests and JavaScript –> help deal with the non-typed system I like JavaScript, but I also dislike working with JavaScript. I miss C# telling me if a property doesn’t actually exist at build time. I like the idea of TypeScript and hope to use it more in the future. I also use KnockoutJs, which has observables that need to be called with ending (), since the observable is a function. It’s hard to remember when to use () or not and the Jasmine specs/tests help ensure the correct usage.   This should give you an idea of the benefits that I see in using the BDD approach. I’m sure there are more. It talks a lot of practice, investment and experimentation to figure out how to approach this and to get comfortable with it. I agree with Scott Allen in the video I linked above “Remember that TDD can take some practice. So if you're not doing test-driven design right now? You can start and practice and get better. And you'll reach a point where you'll never want to get back.”
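
    For readers who haven't seen the Given/When/Then style mentioned above expressed in code, here is a minimal SpecFlow sketch. The Gherkin scenario lives in a .feature file and the C# bindings map each step to code; the domain types (Customer, Order) and the discount rule are made up purely for illustration.

```csharp
// Discount.feature (Gherkin):
//   Scenario: Returning customer gets a loyalty discount
//     Given a customer with 5 previous orders
//     When they place an order for 100
//     Then the order total should be 90

using NUnit.Framework;
using TechTalk.SpecFlow;

[Binding]
public class DiscountSteps
{
    private Customer customer;   // hypothetical domain types, defined below
    private Order order;

    [Given(@"a customer with (\d+) previous orders")]
    public void GivenACustomerWithPreviousOrders(int previousOrders)
    {
        customer = new Customer { PreviousOrders = previousOrders };
    }

    [When(@"they place an order for (\d+)")]
    public void WhenTheyPlaceAnOrderFor(decimal amount)
    {
        order = customer.PlaceOrder(amount);
    }

    [Then(@"the order total should be (\d+)")]
    public void ThenTheOrderTotalShouldBe(decimal expectedTotal)
    {
        Assert.AreEqual(expectedTotal, order.Total);
    }
}

// Invented domain code so the sketch is self-contained.
public class Customer
{
    public int PreviousOrders { get; set; }
    public Order PlaceOrder(decimal amount)
    {
        decimal total = PreviousOrders > 0 ? amount * 0.9m : amount;   // 10% loyalty discount
        return new Order { Total = total };
    }
}

public class Order { public decimal Total { get; set; } }
```

    The same scenario doubles as the communication and documentation artifact described earlier: QA, developers and the business can all read the feature file, while the bindings keep it executable.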


  • Notes on implementing Visual Studio 2010 Navigate To

    - by cyberycon
    One of the many neat functions added to Visual Studio in VS 2010 was the Navigate To feature. You can find it by clicking Edit, Navigate To, or by using the keyboard shortcut Ctrl, (yes, that's control plus the comma key). This pops up the Navigate To dialog that looks like this: As you type, Navigate To starts searching through a number of different search providers for your term. The entries in the list change as you type, with most providers doing some kind of fuzzy or at least substring matching. If you have C#, C++ or Visual Basic projects in your solution, all symbols defined in those projects are searched. There's also a file search provider, which displays all matching filenames from projects in the current solution as well. And, if you have a Visual Studio package of your own, you can implement a provider too. Micro Focus (where I work) provide the Visual COBOL language inside Visual Studio (http://visualstudiogallery.msdn.microsoft.com/ef9bc810-c133-4581-9429-b01420a9ea40 ), and we wanted to provide this functionality too. This post provides some notes on the things I discovered mainly through trial and error, but also with some kind help from devs inside Microsoft. The expectation of Navigate To is that it searches across the whole solution, not just the current project. So in our case, we wanted to search for all COBOL symbols inside all of our Visual COBOL projects inside the solution. So first of all, here's the Microsoft documentation on Navigate To: http://msdn.microsoft.com/en-us/library/ee844862.aspx . It's the reference information on the Microsoft.VisualStudio.Language.NavigateTo.Interfaces Namespace, and it lists all the interfaces you will need to implement to create your own Navigate To provider. Navigate To uses Visual Studio's latest mechanism for integrating external functionality and services, Managed Extensibility Framework (MEF). MEF components don't require any registration with COM or any other registry entries to be found by Visual Studio. Visual Studio looks in several well-known locations for manifest files (extension.vsixmanifest). It then uses reflection to scan for MEF attributes on classes in the assembly to determine which functionality the assembly provides. MEF itself is actually part of the .NET framework, and you can learn more about it here: http://mef.codeplex.com/. To get started with Visual Studio and MEF you could do worse than look at some of the editor examples on the VSX page http://archive.msdn.microsoft.com/vsx . I've also written a small application to help with switching between development and production MEF assemblies, which you can find on Codeproject: http://www.codeproject.com/KB/miscctrl/MEF_Switch.aspx. The Navigate To interfaces Back to Navigate To, and summarizing the MSDN reference documentation, you need to implement the following interfaces: INavigateToItemProviderFactoryThis is Visual Studio's entry point to your Navigate To implementation, and you must decorate your implementation with the following MEF export attribute: [Export(typeof(INavigateToItemProviderFactory))]  INavigateToItemProvider Your INavigateToItemProviderFactory needs to return your implementation of INavigateToItemProvider. This class implements StartSearch() and StopSearch(). StartSearch() is the guts of your provider, and we'll come back to it in a minute. This object also needs to implement IDisposeable(). INavigateToItemDisplayFactory Your INavigateToItemProvider hands back NavigateToItems to the NavigateTo framework. 
But to give you good control over what appears in the NavigateTo dialog box, these items will be handed back to your INavigateToItemDisplayFactory, which must create objects implementing INavigateToItemDisplay  INavigateToItemDisplay Each of these objects represents one result in the Navigate To dialog box. As well as providing the description and name of the item, this object also has a NavigateTo() method that should be capable of displaying the item in an editor when invoked. Carrying out the search The lifecycle of your INavigateToItemProvider is the same as that of the Navigate To dialog. This dialog is modal, which makes your implementation a little easier because you know that the user can't be changing things in editors and the IDE while this dialog is up. But the Navigate To dialog DOES NOT run on the main UI thread of the IDE – so you need to be aware of that if you want to interact with editors or other parts of the IDE UI. When the user invokes the Navigate To dialog, your INavigateToItemProvider gets sent a TryCreateNavigateToItemProvider() message. Instantiate your INavigateToItemProvider and hand this back. The sequence diagram below shows what happens next. Your INavigateToItemProvider will get called with StartSearch(), and passed an INavigateToCallback. StartSearch() is an asynchronous request – you must return from this method as soon as possible, and conduct your search on a separate thread. For each match to the search term, instantiate a NavigateToItem object and send it to INavigateToCallback.AddItem(). But as the user types in the Search Terms field, NavigateTo will invoke your StartSearch() method repeatedly with the changing search term. When you receive the next StartSearch() message, you have to abandon your current search, and start a new one. You can't rely on receiving a StopSearch() message every time. Finally, when the Navigate To dialog box is closed by the user, you will get a Dispose() message – that's your cue to abandon any uncompleted searches, and dispose any resources you might be using as part of your search. While you conduct your search invoke INavigateToCallback.ReportProgress() occasionally to provide feedback about how close you are to completing the search. There does not appear to be any particular requirement to how often you invoke ReportProgress(), and you report your progress as the ratio of two integers. In my implementation I report progress in terms of the number of symbols I've searched over the total number of symbols in my dictionary, and send a progress report every 16 symbols. Displaying the Results The Navigate to framework invokes INavigateToItemDisplayProvider.CreateItemDisplay() once for each result you passed to the INavigateToCallback. CreateItemDisplay() is passed the NavigateToItem you handed to the callback, and must return an INavigateToItemDisplay object. NavigateToItem is a sealed class which has a few properties, including the name of the symbol. It also has a Tag property, of type object. This enables you to stash away all the information you will need to create your INavigateToItemDisplay, which must implement an INavigateTo() method to display a symbol in an editor IDE when the user double-clicks an entry in the Navigate To dialog box. Since the tag is of type object, it is up to you, the implementor, to decide what kind of object you store in here, and how it enables the retrieval of other information which is not included in the NavigateToItem properties. 
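
    To pull the lifecycle above together, here is a rough sketch of the factory and provider side. It follows the interfaces named in this article and the MSDN reference, but treat the exact signatures - the NavigateToItem constructor in particular - as approximations to verify against the documentation. The symbol index (SymbolIndex, SymbolInfo) and the COBOL-flavoured names are invented for illustration, and the display factory referenced here (SymbolItemDisplayFactory) is sketched a little further down, after the discussion of the display properties.

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.Threading.Tasks;
using Microsoft.VisualStudio.Language.NavigateTo.Interfaces;

[Export(typeof(INavigateToItemProviderFactory))]
public class SymbolNavigateToFactory : INavigateToItemProviderFactory
{
    public bool TryCreateNavigateToItemProvider(IServiceProvider serviceProvider,
                                                out INavigateToItemProvider provider)
    {
        provider = new SymbolNavigateToProvider();
        return true;
    }
}

public class SymbolNavigateToProvider : INavigateToItemProvider
{
    private readonly INavigateToItemDisplayFactory displayFactory = new SymbolItemDisplayFactory();
    private volatile bool stopRequested;

    public void StartSearch(INavigateToCallback callback, string searchValue)
    {
        stopRequested = false;

        // Return immediately; the dialog expects the search to run on its own thread.
        Task.Factory.StartNew(() =>
        {
            IList<SymbolInfo> symbols = SymbolIndex.AllSymbols();   // hypothetical symbol dictionary
            for (int i = 0; i < symbols.Count; i++)
            {
                if (stopRequested) return;

                if (symbols[i].Name.IndexOf(searchValue, StringComparison.OrdinalIgnoreCase) >= 0)
                    callback.AddItem(CreateItem(symbols[i]));

                if (i % 16 == 0)
                    callback.ReportProgress(i, symbols.Count);   // progress as a ratio of two integers
            }
            callback.Done();   // assumption: the callback exposes a completion signal
        });
    }

    public void StopSearch() { stopRequested = true; }   // a new StartSearch supersedes the old one
    public void Dispose()    { stopRequested = true; }   // dialog closed - abandon any running search

    private NavigateToItem CreateItem(SymbolInfo symbol)
    {
        // Constructor arguments abbreviated - check the NavigateToItem reference for the
        // full list. The important part is stashing the symbol in the Tag property.
        return new NavigateToItem(symbol.Name, NavigateToItemKind.Field, "COBOL",
                                  symbol.Name, symbol, MatchKind.Regular, displayFactory);
    }
}
```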
Some of the INavigateToItemDisplay properties are self-explanatory, but a couple of them are less obvious: Additional informationThe string you return here is displayed inside brackets on the same line as the Name property. In English locales, Visual Studio includes the preposition "of". If you look at the first line in the Navigate To screenshot at the top of this article, Book_WebRole.Default is the additional information for textBookAuthor, and is the namespace qualified type name the symbol appears in. For procedural COBOL code we display the Program Id as the additional information DescriptionItemsYou can use this property to return any textual description you want about the item currently selected. You return a collection of DescriptionItem objects, each of which has a category and description collection of DescriptionRun objects. A DescriptionRun enables you to specify some text, and optional formatting, so you have some control over the appearance of the displayed text. The DescriptionItems property is displayed at the bottom of the Navigate To dialog box, with the Categories on the left and the Descriptions on the right. The Visual COBOL implementation uses it to display more information about the location of an item, making it easier for the user to know disambiguate duplicate names (something there can be a lot of in large COBOL applications). Summary I hope this article is useful for anyone implementing Navigate To. It is a fantastic navigation feature that Microsoft have added to Visual Studio, but at the moment there still don't seem to be any examples on how to implement it, and the reference information on MSDN is a little brief for anyone attempting an implementation.
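
    And to round the notes off, a matching sketch of the display side. The member names follow the description above and the MSDN reference (Name, AdditionalInformation, DescriptionItems, NavigateTo), but verify the exact shapes - especially the DescriptionItem and DescriptionRun constructors - before relying on them. SymbolInfo and SymbolNavigator are the same invented types as in the provider sketch.

```csharp
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Drawing;
using Microsoft.VisualStudio.Language.NavigateTo.Interfaces;

public class SymbolItemDisplayFactory : INavigateToItemDisplayFactory
{
    public INavigateToItemDisplay CreateItemDisplay(NavigateToItem item)
    {
        // Recover the symbol we stashed in the Tag when the item was created.
        return new SymbolItemDisplay((SymbolInfo)item.Tag);
    }
}

public class SymbolItemDisplay : INavigateToItemDisplay
{
    private readonly SymbolInfo symbol;
    public SymbolItemDisplay(SymbolInfo symbol) { this.symbol = symbol; }

    public string Name { get { return symbol.Name; } }

    // Shown in brackets after the name in the results list.
    public string AdditionalInformation { get { return symbol.ProgramId; } }

    public string Description { get { return null; } }
    public Icon Glyph { get { return null; } }

    // Category/description pairs rendered at the bottom of the dialog.
    public ReadOnlyCollection<DescriptionItem> DescriptionItems
    {
        get
        {
            var items = new List<DescriptionItem>
            {
                new DescriptionItem(
                    new ReadOnlyCollection<DescriptionRun>(new[] { new DescriptionRun("File:") }),
                    new ReadOnlyCollection<DescriptionRun>(new[] { new DescriptionRun(symbol.FilePath) }))
            };
            return new ReadOnlyCollection<DescriptionItem>(items);
        }
    }

    public void NavigateTo()
    {
        // Open the file and position the caret on the symbol. Remember this is
        // invoked from the Navigate To dialog, not the IDE's main UI thread.
        SymbolNavigator.OpenInEditor(symbol);   // hypothetical helper
    }
}
```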

    Read the article

  • Generating radial indicator images using C#

    - by DigiMortal
    In one of my projects I needed to draw radial indicators for processes measured in percent – simple images like the one shown on the right. I solved the problem by creating the images in C# and saving them to the server's hard disk, so once an image has been generated it is served from disk the next time. I am no master of graphics or geometry, but here is the code I wrote.

    Drawing the radial indicator

    To keep things quick and easy – later it may be one of the younger developers who needs to change things – I divided the indicator drawing process into the four steps shown below.

    1. Fill pie
    2. Draw circles
    3. Fill inner circle
    4. Draw text

    Drawing the image

    Here is the code that draws the indicators.

    private static void SaveRadialIndicator(int percent, string filePath)
    {
        using (Bitmap bitmap = new Bitmap(100, 100))
        using (Graphics objGraphics = Graphics.FromImage(bitmap))
        {
            // Initialize graphics
            objGraphics.Clear(Color.White);
            objGraphics.SmoothingMode = SmoothingMode.AntiAlias;
            objGraphics.TextRenderingHint = TextRenderingHint.ClearTypeGridFit;

            // Fill pie
            // Degrees are taken clockwise, 0 is parallel with x
            // For sweep angle we must convert percent to degrees (90/25 = 18/5)
            float startAngle = -90.0F;
            float sweepAngle = (18.0F / 5) * percent;

            Rectangle rectangle = new Rectangle(5, 5, 90, 90);
            objGraphics.FillPie(Brushes.Orange, rectangle, startAngle, sweepAngle);

            // Draw circles
            rectangle = new Rectangle(5, 5, 90, 90);
            objGraphics.DrawEllipse(Pens.LightGray, rectangle);
            rectangle = new Rectangle(20, 20, 60, 60);
            objGraphics.DrawEllipse(Pens.LightGray, rectangle);

            // Fill inner circle with white
            rectangle = new Rectangle(21, 21, 58, 58);
            objGraphics.FillEllipse(Brushes.White, rectangle);

            // Draw text on image
            // Use rectangle for text and align text to center of rectangle
            var font = new Font("Arial", 13, FontStyle.Bold);
            StringFormat stringFormat = new StringFormat();
            stringFormat.Alignment = StringAlignment.Center;
            stringFormat.LineAlignment = StringAlignment.Center;

            rectangle = new Rectangle(20, 40, 62, 20);
            objGraphics.DrawString(percent + "%", font, Brushes.DarkGray, rectangle, stringFormat);

            // Save indicator to file
            objGraphics.Flush();
            if (File.Exists(filePath))
                File.Delete(filePath);

            bitmap.Save(filePath, ImageFormat.Png);
        }
    }

    Using indicators on a web page

    To show indicators on your web page, you can use the following code in the page that outputs the indicator images:

    protected void Page_Load(object sender, EventArgs e)
    {
        var percentString = Request.QueryString["percent"];
        var percent = 0;
        if (!int.TryParse(percentString, out percent))
            return;
        if (percent < 0 || percent > 100)
            return;

        var file = Server.MapPath("~/images/percent/" + percent + ".png");
        if (!File.Exists(file))
            SaveRadialIndicator(percent, file);

        Response.Clear();
        Response.ContentType = "image/png";
        Response.WriteFile(file);
        Response.End();
    }

    On pages where you need an indicator, set the image source to Indicator.aspx (if that is what you named the page that serves the images) and pass the percent as a query string:

        <img src="Indicator.aspx?percent=30" />

    That's it! If somebody knows a simpler way to generate indicators like this, I am interested in your feedback.
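    One possible simplification – not part of the original post, and the handler name and DrawRadialIndicator refactoring are assumptions – is to skip the disk cache and stream the PNG straight from memory through a generic handler (e.g. Indicator.ashx), letting the browser cache the result instead:

    // Sketch: serve the indicator directly from memory, no file on disk.
    // Assumes the drawing code above is refactored into DrawRadialIndicator(),
    // which returns the Bitmap instead of saving it.
    using System;
    using System.Drawing;
    using System.Drawing.Drawing2D;
    using System.Drawing.Imaging;
    using System.IO;
    using System.Web;

    public class IndicatorHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            int percent;
            if (!int.TryParse(context.Request.QueryString["percent"], out percent)
                || percent < 0 || percent > 100)
            {
                context.Response.StatusCode = 400;               // bad or missing percent value
                return;
            }

            using (Bitmap bitmap = DrawRadialIndicator(percent))
            using (MemoryStream stream = new MemoryStream())
            {
                bitmap.Save(stream, ImageFormat.Png);            // PNG encoding needs a seekable stream
                context.Response.ContentType = "image/png";
                context.Response.Cache.SetCacheability(HttpCacheability.Public);
                stream.WriteTo(context.Response.OutputStream);   // stream straight to the client
            }
        }

        public bool IsReusable { get { return true; } }

        private static Bitmap DrawRadialIndicator(int percent)
        {
            // Same GDI+ drawing steps as SaveRadialIndicator above, minus the file handling
            Bitmap bitmap = new Bitmap(100, 100);
            using (Graphics objGraphics = Graphics.FromImage(bitmap))
            {
                objGraphics.Clear(Color.White);
                objGraphics.SmoothingMode = SmoothingMode.AntiAlias;
                // ...fill pie, draw circles and text exactly as in the article...
            }
            return bitmap;
        }
    }

    The trade-off is that every request redraws the image; for a 100x100 bitmap that is usually cheap, while the disk cache in the article avoids even that cost.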

    Read the article

  • Create Resume problem

    - by ar31an
    hello mates, i am working on a project of online resume management system and i am encountering an exception while creating resume. [b] Exception: Data type mismatch in criteria expression. [/b] here is my code for Create Resume-1.aspx.cs using System; using System.Data; using System.Configuration; using System.Collections; using System.Web; using System.Web.Security; using System.Web.UI; using System.Web.UI.WebControls; using System.Web.UI.WebControls.WebParts; using System.Web.UI.HtmlControls; using System.Data.OleDb; public partial class _Default : System.Web.UI.Page { string sql, sql2, sql3, sql4, sql5, sql6, sql7, sql8, sql9, sql10, sql11, sql12; string conString = "Provider=Microsoft.ACE.OLEDB.12.0; Data Source=D:\\Deliverable4.accdb"; protected OleDbConnection rMSConnection; protected OleDbCommand rMSCommand; protected OleDbDataAdapter rMSDataAdapter; protected DataSet dataSet; protected DataTable dataTable; protected DataRow dataRow; protected void Page_Load(object sender, EventArgs e) { } protected void Button1_Click(object sender, EventArgs e) { string contact1 = TextBox1.Text; string contact2 = TextBox2.Text; string cellphone = TextBox3.Text; string address = TextBox4.Text; string city = TextBox5.Text; string addqualification = TextBox18.Text; //string SecondLastDegreeGrade = TextBox17.Text; //string SecondLastDegreeInstitute = TextBox16.Text; //string SecondLastDegreeNameOther = TextBox15.Text; string LastDegreeNameOther = TextBox11.Text; string LastDegreeInstitute = TextBox12.Text; string LastDegreeGrade = TextBox13.Text; string tentativeFromDate = (DropDownList4.SelectedValue + " " + DropDownList7.SelectedValue + " " + DropDownList8.SelectedValue); try { sql6 = "select CountryID from COUNTRY"; rMSConnection = new OleDbConnection(conString); rMSDataAdapter = new OleDbDataAdapter(sql6, rMSConnection); dataSet = new DataSet("cID"); rMSDataAdapter.Fill(dataSet, "COUNTRY"); dataTable = dataSet.Tables["COUNTRY"]; int cId = (int)dataTable.Rows[0][0]; rMSConnection.Close(); sql4 = "select PersonalDetailID from PERSONALDETAIL"; rMSConnection = new OleDbConnection(conString); rMSDataAdapter = new OleDbDataAdapter(sql4, rMSConnection); dataSet = new DataSet("PDID"); rMSDataAdapter.Fill(dataSet, "PERSONALDETAIL"); dataTable = dataSet.Tables["PERSONALDETAIL"]; int PDId = (int)dataTable.Rows[0][0]; rMSConnection.Close(); sql5 = "update PERSONALDETAIL set Phone1 ='" + contact1 + "' , Phone2 = '" + contact2 + "', CellPhone = '" + cellphone + "', Address = '" + address + "', City = '" + city + "', CountryID = '" + cId + "' where PersonalDetailID = '" + PDId + "'"; rMSConnection = new OleDbConnection(conString); rMSConnection.Open(); rMSCommand = new OleDbCommand(sql5, rMSConnection); rMSCommand.ExecuteNonQuery(); rMSConnection.Close(); sql3 = "select DesignationID from DESIGNATION"; rMSConnection = new OleDbConnection(conString); rMSDataAdapter = new OleDbDataAdapter(sql3, rMSConnection); dataSet = new DataSet("DesID"); rMSDataAdapter.Fill(dataSet, "DESIGNATION"); dataTable = dataSet.Tables["DESIGNATION"]; int desId = (int)dataTable.Rows[0][0]; rMSConnection.Close(); sql2 = "select DepartmentID from DEPARTMENT"; rMSConnection = new OleDbConnection(conString); rMSDataAdapter = new OleDbDataAdapter(sql2, rMSConnection); dataSet = new DataSet("DID"); rMSDataAdapter.Fill(dataSet, "DEPARTMENT"); dataTable = dataSet.Tables["DEPARTMENT"]; int dId = (int)dataTable.Rows[0][0]; rMSConnection.Close(); sql7 = "select ResumeID from RESUME"; rMSConnection = new OleDbConnection(conString); 
rMSDataAdapter = new OleDbDataAdapter(sql7, rMSConnection); dataSet = new DataSet("rID"); rMSDataAdapter.Fill(dataSet, "RESUME"); dataTable = dataSet.Tables["RESUME"]; int rId = (int)dataTable.Rows[0][0]; rMSConnection.Close(); sql = "update RESUME set PersonalDetailID ='" + PDId + "' , DesignationID = '" + desId + "', DepartmentID = '" + dId + "', TentativeFromDate = '" + tentativeFromDate + "', AdditionalQualification = '" + addqualification + "' where ResumeID = '" + rId + "'"; rMSConnection = new OleDbConnection(conString); rMSConnection.Open(); rMSCommand = new OleDbCommand(sql, rMSConnection); rMSCommand.ExecuteNonQuery(); rMSConnection.Close(); sql8 = "insert into INSTITUTE (InstituteName) values ('" + LastDegreeInstitute + "')"; rMSConnection = new OleDbConnection(conString); rMSConnection.Open(); rMSCommand = new OleDbCommand(sql8, rMSConnection); rMSCommand.ExecuteNonQuery(); rMSConnection.Close(); sql9 = "insert into DEGREE (DegreeName) values ('" + LastDegreeNameOther + "')"; rMSConnection = new OleDbConnection(conString); rMSConnection.Open(); rMSCommand = new OleDbCommand(sql9, rMSConnection); rMSCommand.ExecuteNonQuery(); rMSConnection.Close(); sql11 = "select InstituteID from INSTITUTE"; rMSConnection = new OleDbConnection(conString); rMSDataAdapter = new OleDbDataAdapter(sql11, rMSConnection); dataSet = new DataSet("insID"); rMSDataAdapter.Fill(dataSet, "INSTITUTE"); dataTable = dataSet.Tables["INSTITUTE"]; int insId = (int)dataTable.Rows[0][0]; rMSConnection.Close(); sql12 = "select DegreeID from DEGREE"; rMSConnection = new OleDbConnection(conString); rMSDataAdapter = new OleDbDataAdapter(sql12, rMSConnection); dataSet = new DataSet("degID"); rMSDataAdapter.Fill(dataSet, "DEGREE"); dataTable = dataSet.Tables["DEGREE"]; int degId = (int)dataTable.Rows[0][0]; rMSConnection.Close(); sql10 = "insert into QUALIFICATION (Grade, ResumeID, InstituteID, DegreeID) values ('" + LastDegreeGrade + "', '" + rId + "', '" + insId + "', '" + degId + "')"; rMSConnection = new OleDbConnection(conString); rMSConnection.Open(); rMSCommand = new OleDbCommand(sql10, rMSConnection); rMSCommand.ExecuteNonQuery(); rMSConnection.Close(); Response.Redirect("Applicant.aspx"); } catch (Exception exp) { rMSConnection.Close(); Label1.Text = "Exception: " + exp.Message; } } protected void Button2_Click(object sender, EventArgs e) { } } And for Create Resume-1.aspx <%@ Page Language="C#" AutoEventWireup="true" CodeFile="Create Resume-1.aspx.cs" Inherits="_Default" %> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" > <head runat="server"> <title>Untitled Page</title> </head> <body> <form id="form1" runat="server"> <div><center> <strong><span style="font-size: 16pt"></span></strong>&nbsp;</center> <center> &nbsp;</center> <center style="background-color: silver"> &nbsp;</center> <center> <strong><span style="font-size: 16pt">Step 1</span></strong></center> <center style="background-color: silver"> &nbsp;</center> <center> &nbsp;</center> <center> &nbsp;</center> <center> <asp:Label ID="PhoneNo1" runat="server" Text="Contact No 1*"></asp:Label> <asp:TextBox ID="TextBox1" runat="server"></asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator1" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." 
ControlToValidate="TextBox1"></asp:RequiredFieldValidator><br /> <asp:Label ID="PhoneNo2" runat="server" Text="Contact No 2"></asp:Label> <asp:TextBox ID="TextBox2" runat="server"></asp:TextBox><br /> <asp:Label ID="CellNo" runat="server" Text="Cell Phone No"></asp:Label> <asp:TextBox ID="TextBox3" runat="server"></asp:TextBox><br /> <asp:Label ID="Address" runat="server" Text="Street Address*"></asp:Label> <asp:TextBox ID="TextBox4" runat="server"></asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator2" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="TextBox4"></asp:RequiredFieldValidator><br /> <asp:Label ID="City" runat="server" Text="City*"></asp:Label> <asp:TextBox ID="TextBox5" runat="server"></asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator3" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="TextBox5"></asp:RequiredFieldValidator><br /> <asp:Label ID="Country" runat="server" Text="Country of Origin*"></asp:Label> <asp:DropDownList ID="DropDownList1" runat="server" DataSourceID="SqlDataSource1" DataTextField="CountryName" DataValueField="CountryID"> </asp:DropDownList><asp:SqlDataSource ID="SqlDataSource1" runat="server" ConnectionString="<%$ ConnectionStrings:ConnectionString7 %>" DeleteCommand="DELETE FROM [COUNTRY] WHERE (([CountryID] = ?) OR ([CountryID] IS NULL AND ? IS NULL))" InsertCommand="INSERT INTO [COUNTRY] ([CountryID], [CountryName]) VALUES (?, ?)" ProviderName="<%$ ConnectionStrings:ConnectionString7.ProviderName %>" SelectCommand="SELECT * FROM [COUNTRY]" UpdateCommand="UPDATE [COUNTRY] SET [CountryName] = ? WHERE (([CountryID] = ?) OR ([CountryID] IS NULL AND ? IS NULL))"> <DeleteParameters> <asp:Parameter Name="CountryID" Type="Int32" /> </DeleteParameters> <UpdateParameters> <asp:Parameter Name="CountryName" Type="String" /> <asp:Parameter Name="CountryID" Type="Int32" /> </UpdateParameters> <InsertParameters> <asp:Parameter Name="CountryID" Type="Int32" /> <asp:Parameter Name="CountryName" Type="String" /> </InsertParameters> </asp:SqlDataSource> <asp:RequiredFieldValidator ID="RequiredFieldValidator4" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="DropDownList1"></asp:RequiredFieldValidator><br /> <asp:Label ID="DepartmentOfInterest" runat="server" Text="Department of Interest*"></asp:Label> <asp:DropDownList ID="DropDownList2" runat="server" DataSourceID="SqlDataSource2" DataTextField="DepartmentName" DataValueField="DepartmentID"> </asp:DropDownList><asp:SqlDataSource ID="SqlDataSource2" runat="server" ConnectionString="<%$ ConnectionStrings:ConnectionString7 %>" DeleteCommand="DELETE FROM [DEPARTMENT] WHERE [DepartmentID] = ?" InsertCommand="INSERT INTO [DEPARTMENT] ([DepartmentID], [DepartmentName]) VALUES (?, ?)" ProviderName="<%$ ConnectionStrings:ConnectionString7.ProviderName %>" SelectCommand="SELECT * FROM [DEPARTMENT]" UpdateCommand="UPDATE [DEPARTMENT] SET [DepartmentName] = ? 
WHERE [DepartmentID] = ?"> <DeleteParameters> <asp:Parameter Name="DepartmentID" Type="Int32" /> </DeleteParameters> <UpdateParameters> <asp:Parameter Name="DepartmentName" Type="String" /> <asp:Parameter Name="DepartmentID" Type="Int32" /> </UpdateParameters> <InsertParameters> <asp:Parameter Name="DepartmentID" Type="Int32" /> <asp:Parameter Name="DepartmentName" Type="String" /> </InsertParameters> </asp:SqlDataSource> <asp:RequiredFieldValidator ID="RequiredFieldValidator5" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="DropDownList2"></asp:RequiredFieldValidator><br /> <asp:Label ID="DesignationAppliedFor" runat="server" Text="Position Applied For*"></asp:Label> <asp:DropDownList ID="DropDownList3" runat="server" DataSourceID="SqlDataSource3" DataTextField="DesignationName" DataValueField="DesignationID"> </asp:DropDownList><asp:SqlDataSource ID="SqlDataSource3" runat="server" ConnectionString="<%$ ConnectionStrings:ConnectionString7 %>" DeleteCommand="DELETE FROM [DESIGNATION] WHERE [DesignationID] = ?" InsertCommand="INSERT INTO [DESIGNATION] ([DesignationID], [DesignationName], [DesignationStatus]) VALUES (?, ?, ?)" ProviderName="<%$ ConnectionStrings:ConnectionString7.ProviderName %>" SelectCommand="SELECT * FROM [DESIGNATION]" UpdateCommand="UPDATE [DESIGNATION] SET [DesignationName] = ?, [DesignationStatus] = ? WHERE [DesignationID] = ?"> <DeleteParameters> <asp:Parameter Name="DesignationID" Type="Int32" /> </DeleteParameters> <UpdateParameters> <asp:Parameter Name="DesignationName" Type="String" /> <asp:Parameter Name="DesignationStatus" Type="String" /> <asp:Parameter Name="DesignationID" Type="Int32" /> </UpdateParameters> <InsertParameters> <asp:Parameter Name="DesignationID" Type="Int32" /> <asp:Parameter Name="DesignationName" Type="String" /> <asp:Parameter Name="DesignationStatus" Type="String" /> </InsertParameters> </asp:SqlDataSource> <asp:RequiredFieldValidator ID="RequiredFieldValidator6" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." 
ControlToValidate="DropDownList3"></asp:RequiredFieldValidator><br /> <asp:Label ID="TentativeFromDate" runat="server" Text="Can Join From*"></asp:Label> <asp:DropDownList ID="DropDownList4" runat="server"> <asp:ListItem>1</asp:ListItem> <asp:ListItem>2</asp:ListItem> <asp:ListItem>3</asp:ListItem> <asp:ListItem>4</asp:ListItem> <asp:ListItem>5</asp:ListItem> <asp:ListItem>6</asp:ListItem> <asp:ListItem>7</asp:ListItem> <asp:ListItem>8</asp:ListItem> <asp:ListItem>9</asp:ListItem> <asp:ListItem>10</asp:ListItem> <asp:ListItem>11</asp:ListItem> <asp:ListItem>12</asp:ListItem> </asp:DropDownList>&nbsp;<asp:DropDownList ID="DropDownList7" runat="server"> <asp:ListItem>1</asp:ListItem> <asp:ListItem>2</asp:ListItem> <asp:ListItem>3</asp:ListItem> <asp:ListItem>4</asp:ListItem> <asp:ListItem>5</asp:ListItem> <asp:ListItem>6</asp:ListItem> <asp:ListItem>7</asp:ListItem> <asp:ListItem>8</asp:ListItem> <asp:ListItem>9</asp:ListItem> <asp:ListItem>10</asp:ListItem> <asp:ListItem>11</asp:ListItem> <asp:ListItem>12</asp:ListItem> <asp:ListItem>13</asp:ListItem> <asp:ListItem>14</asp:ListItem> <asp:ListItem>15</asp:ListItem> <asp:ListItem>16</asp:ListItem> <asp:ListItem>17</asp:ListItem> <asp:ListItem>18</asp:ListItem> <asp:ListItem>19</asp:ListItem> <asp:ListItem>20</asp:ListItem> <asp:ListItem>21</asp:ListItem> <asp:ListItem>22</asp:ListItem> <asp:ListItem>23</asp:ListItem> <asp:ListItem>24</asp:ListItem> <asp:ListItem>25</asp:ListItem> <asp:ListItem>26</asp:ListItem> <asp:ListItem>27</asp:ListItem> <asp:ListItem>28</asp:ListItem> <asp:ListItem>29</asp:ListItem> <asp:ListItem>30</asp:ListItem> <asp:ListItem>31</asp:ListItem> </asp:DropDownList> <asp:DropDownList ID="DropDownList8" runat="server"> <asp:ListItem>2010</asp:ListItem> </asp:DropDownList> <asp:RequiredFieldValidator ID="RequiredFieldValidator7" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="DropDownList4"></asp:RequiredFieldValidator></center> <center> <br /> <asp:Label ID="LastDegreeName" runat="server" Text="Last Degree*"></asp:Label> <asp:DropDownList ID="DropDownList5" runat="server" DataSourceID="SqlDataSource5" DataTextField="DegreeName" DataValueField="DegreeID"> </asp:DropDownList><asp:SqlDataSource ID="SqlDataSource5" runat="server" ConnectionString="<%$ ConnectionStrings:ConnectionString7 %>" DeleteCommand="DELETE FROM [DEGREE] WHERE [DegreeID] = ?" InsertCommand="INSERT INTO [DEGREE] ([DegreeID], [DegreeName]) VALUES (?, ?)" ProviderName="<%$ ConnectionStrings:ConnectionString7.ProviderName %>" SelectCommand="SELECT * FROM [DEGREE]" UpdateCommand="UPDATE [DEGREE] SET [DegreeName] = ? WHERE [DegreeID] = ?"> <DeleteParameters> <asp:Parameter Name="DegreeID" Type="Int32" /> </DeleteParameters> <UpdateParameters> <asp:Parameter Name="DegreeName" Type="String" /> <asp:Parameter Name="DegreeID" Type="Int32" /> </UpdateParameters> <InsertParameters> <asp:Parameter Name="DegreeID" Type="Int32" /> <asp:Parameter Name="DegreeName" Type="String" /> </InsertParameters> </asp:SqlDataSource> <asp:RequiredFieldValidator ID="RequiredFieldValidator8" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." 
ControlToValidate="DropDownList5"></asp:RequiredFieldValidator><br /> <asp:Label ID="LastDegreeNameOther" runat="server" Text="Other"></asp:Label> <asp:TextBox ID="TextBox11" runat="server"></asp:TextBox><br /> <asp:Label ID="LastDegreeInstitute" runat="server" Text="Institute Name*"></asp:Label> <asp:TextBox ID="TextBox12" runat="server"></asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator9" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="TextBox12"></asp:RequiredFieldValidator><br /> <asp:Label ID="LastDegreeGrade" runat="server" Text="Marks / Grade*"></asp:Label> <asp:TextBox ID="TextBox13" runat="server"></asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator10" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="TextBox13"></asp:RequiredFieldValidator></center> <center> &nbsp;</center> <center> <br /> <asp:Label ID="SecondLastDegreeName" runat="server" Text="Second Last Degree*"></asp:Label> <asp:DropDownList ID="DropDownList6" runat="server" DataSourceID="SqlDataSource4" DataTextField="DegreeName" DataValueField="DegreeID"> </asp:DropDownList><asp:SqlDataSource ID="SqlDataSource4" runat="server" ConnectionString="<%$ ConnectionStrings:ConnectionString7 %>" DeleteCommand="DELETE FROM [DEGREE] WHERE [DegreeID] = ?" InsertCommand="INSERT INTO [DEGREE] ([DegreeID], [DegreeName]) VALUES (?, ?)" ProviderName="<%$ ConnectionStrings:ConnectionString7.ProviderName %>" SelectCommand="SELECT * FROM [DEGREE]" UpdateCommand="UPDATE [DEGREE] SET [DegreeName] = ? WHERE [DegreeID] = ?"> <DeleteParameters> <asp:Parameter Name="DegreeID" Type="Int32" /> </DeleteParameters> <UpdateParameters> <asp:Parameter Name="DegreeName" Type="String" /> <asp:Parameter Name="DegreeID" Type="Int32" /> </UpdateParameters> <InsertParameters> <asp:Parameter Name="DegreeID" Type="Int32" /> <asp:Parameter Name="DegreeName" Type="String" /> </InsertParameters> </asp:SqlDataSource> <asp:RequiredFieldValidator ID="RequiredFieldValidator11" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="DropDownList6"></asp:RequiredFieldValidator><br /> <asp:Label ID="SecondLastDegreeNameOther" runat="server" Text="Other"></asp:Label> <asp:TextBox ID="TextBox15" runat="server"></asp:TextBox><br /> <asp:Label ID="SecondLastDegreeInstitute" runat="server" Text="Institute Name*"></asp:Label> <asp:TextBox ID="TextBox16" runat="server"></asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator12" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." ControlToValidate="TextBox16"></asp:RequiredFieldValidator><br /> <asp:Label ID="SecondLastDegreeGrade" runat="server" Text="Marks / Grade*"></asp:Label> <asp:TextBox ID="TextBox17" runat="server"></asp:TextBox> <asp:RequiredFieldValidator ID="RequiredFieldValidator13" runat="server" ErrorMessage="Items marked with '*' cannot be left blank." 
ControlToValidate="TextBox17"></asp:RequiredFieldValidator></center> <center> <br /> <asp:Label ID="AdditionalQualification" runat="server" Text="Additional Qualification"></asp:Label> <asp:TextBox ID="TextBox18" runat="server" TextMode="MultiLine"></asp:TextBox></center> <center> &nbsp;</center> <center> <asp:Button ID="Button1" runat="server" Text="Save and Exit" OnClick="Button1_Click" /> &nbsp;&nbsp; <asp:Button ID="Button2" runat="server" Text="Next" OnClick="Button2_Click" /></center> <center> &nbsp;</center> <center> <asp:Label ID="Label1" runat="server"></asp:Label>&nbsp;</center> <center> &nbsp;</center> <center style="background-color: silver"> &nbsp;</center> </div> </form> </body> </html>

    Read the article

  • How to get the MSOnlineBackup Cmdlets?

    - by gregpakes
    I am trying to manage Azure Online Backup from PowerShell. There is a set of cmdlets called MSOnlineBackup. See technet.microsoft.com/en-us/library/dn249523.aspx. When I try to run: Import-Module MSOnlineBackup I get: The specified module 'MSOnlineBackup' was not loaded because no valid module file was found in any module directory. The TechNet page states that it is included in 4.0.4.0. If I run: $PSVersionTable.PSVersion it returns: Major: 4 Minor: 0 Build: -1 Revision: -1. I am running Windows 8.1. As you can probably tell I am no PowerShell expert. I have also tried installing the Azure Backup Agent, but it says it needs a server operating system. Can anyone tell me how I can get the MSOnlineBackup module on my machine so I can start automating Windows Azure Backup?

    Read the article

  • DB2 insert performance - How to measure

    - by svrist
    [From stackoverflow] I'm trying to find a way to speed up my inserts to DB2 9.7.1 (Ubuntu Linux). I'm watching vmstat and trying to gather some statistics via the db2 get snapshot commands, but I'm not able to figure out which numbers to look at to see where the trouble is. I've read lots of material like http://www.eggheadcafe.com/software/aspnet/35692526/question-multiple-row-in.aspx and http://www.ibm.com/developerworks/data/library/tips/dm-0403wilkins/, and tricks like ALTER TABLE lalala APPEND ON work somewhat (the difference between a dd if=/dev/zero and an insert is still a factor of 10), but I would like to be able to find the counters or other performance indicators that actually show why it makes sense to use those tricks. For example: What is the metric called that shows me that buffer page allocation (the FSCR stuff) is the problem? Where do I see that the insert time is hampered by clustered indexes? I find db2top very useful, but I'm still searching for a more direct "this is your bottleneck" view.
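    Not an answer to the monitoring-metric question, but the insert-side tricks in the linked articles mostly boil down to preparing the statement once and committing many rows per transaction, so per-statement compilation and per-row commits stop dominating. A rough, generic ADO.NET/ODBC sketch (the DSN, credentials and TESTDATA table are invented for illustration and not tied to the poster's schema):

    // Sketch: prepared, parameterized inserts batched inside one transaction.
    using System;
    using System.Data.Odbc;

    class BatchedInserts
    {
        static void Main()
        {
            using (var connection = new OdbcConnection("DSN=SAMPLE;UID=db2inst1;PWD=secret"))
            {
                connection.Open();
                using (OdbcTransaction transaction = connection.BeginTransaction())
                using (OdbcCommand command = connection.CreateCommand())
                {
                    command.Transaction = transaction;
                    command.CommandText = "INSERT INTO TESTDATA (ID, PAYLOAD) VALUES (?, ?)";
                    OdbcParameter id = command.Parameters.Add("@id", OdbcType.Int);
                    OdbcParameter payload = command.Parameters.Add("@payload", OdbcType.VarChar, 100);
                    command.Prepare();                     // compile once, execute many times

                    for (int i = 0; i < 100000; i++)
                    {
                        id.Value = i;
                        payload.Value = "row " + i;
                        command.ExecuteNonQuery();
                    }

                    transaction.Commit();                  // one commit for the whole batch
                }
            }
        }
    }

    This doesn't tell you which snapshot counter to watch, but it is the insert-side change those counters usually end up justifying.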

    Read the article

  • MSTSC crash on connect

    - by rotard
    We are supporting a client with primarily Windows XP machines. The users need to use Remote Desktop to connect to a terminal server. Unfortunately, after upgrading to SP3 on some machines, MSTSC.exe crashes when they try to connect to the terminal server (a Windows 2008 machine). The resolution I have found has been to revert to an older version of MSTSC as described here: http://it.tmod.pl/Blog/EntryId/115/Remote-Desktop-Connection-crashes.aspx . Another tech at my company independently arrived at a similar solution. Unfortunately, now some of the users' printers are missing (when connected to the terminal server). Has anyone else seen this issue? How did you resolve it?

    Read the article

  • VAMT 3.0 Proxy Activate - No ‘Apply Confirmation ID’ option at all

    - by lez
    I tried to activate my Windows box in an isolated network zone, so I followed the process in 'Scenario 2: Proxy Activation' at http://technet.microsoft.com/en-us/library/hh825202.aspx using two VAMT 3.0 hosts. Everything went fine (though I'm not sure which option to choose when exporting VAMT data to a .cilx file – I tried both 'Export products and product keys' and 'Export proxy activation data only'; whether that is the cause of this problem, I have no idea) until I wanted to apply the CID and activate the isolated PC. Under 'Activate' there is no 'Apply Confirmation ID' option at all – the only options are 'Acquire and save confirmation ID only' and 'Acquire confirmation ID, apply to selected machine(s) and activate'. The error message when I choose either of them is 'cannot resolve remote name 'go.microsoft.com'', so it looks like acquiring a confirmation ID always needs to reach that URL. But I just want to apply the CID I already have. Has anyone run into this? I searched the internet and found no answer. Any suggestion would be appreciated, thank you!

    Read the article

  • URL Rewrite in IIS7 Web Service - WebService.asmx => WebService in the SOAP doc

    - by Robert Kaucher
    I have an IIS 7.5 (Server 2008 R2) based web service that I would like to make as independent of the current implementation technology as possible. I am using the URL Rewrite module (http://learn.iis.net/page.aspx/734/url-rewrite-module/) to remove the .asmx portion of the URL, and that is working fine for the HTTP request portion. However, I still see .asmx in the WSDL file when I access it. I was wondering if anyone has done this and, if so, what advice could be offered. It doesn't seem like a hard problem to solve, but I have tried a number of things with "custom tags" and can't seem to get it working to save my life.

    Read the article

  • How can i Install Activex in windows 2008 R2

    - by jazzson
    Hi, I am testing Windows Server 2008 R2 as a desktop OS, but the OA application used in my company must be installed on it. Unfortunately, http://technet.microsoft.com/en-us/library/dd631688(WS.10).aspx, documented by Microsoft, states: The ActiveX Installer Service is not included in Windows Server® 2008 R2. If you attempt to install an ActiveX control from your Web browser on a computer running Windows Server 2008 R2, a User Account Control dialog box with a yellow bar will be displayed warning you that the publisher is unknown. Can anyone help me? Thanks!

    Read the article

  • JoinDomainOrWorkgroup Method FJoinOptions help

    - by Ben
    Anyone have experience of using the JoinDomainOrWorkgroup method of the Win32_ComputerSystem class? I want to write a PowerShell script to join a machine to a domain. There may be an existing computer account for the machine, and if so I want to delete it and rejoin the domain. I've already scripted the "search and destroy" part that deletes the computer account if it exists, but I just noticed the FJoinOptions switches on TechNet. Trouble is, they're a bit ambiguous. Does 4 (0x4), "Deletes an account when a domain exists", mean it will delete the computer account if it already exists on the domain? Also, can you specify the computer name you want the machine to join under with this method, or should you rename the machine and then join the domain? Cheers, Ben. NB – I've been using the guide at http://msdn.microsoft.com/en-us/library/aa392154(VS.85).aspx – not sure if there's a better resource out there.
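    For what it's worth, here is a rough C# (System.Management) sketch of the same WMI call the PowerShell script would make, joining with FJoinOptions = 3 (join domain + create account) – the combination most scripts use once a "search and destroy" step has removed any stale account. The domain name and credentials are placeholders, and the meaning of the 0x4 flag is worth confirming against the MSDN page above rather than inferring from its one-line description.

    // Sketch: call Win32_ComputerSystem.JoinDomainOrWorkgroup via System.Management.
    // Requires a reference to System.Management.dll; domain and credentials are placeholders.
    using System;
    using System.Management;

    class JoinDomain
    {
        // NETSETUP_JOIN_DOMAIN (1) + NETSETUP_ACCT_CREATE (2)
        const uint JoinDomainFlags = 3;

        static void Main()
        {
            var options = new ConnectionOptions { EnablePrivileges = true };
            var scope = new ManagementScope(@"\\.\root\cimv2", options);
            scope.Connect();

            var path = new ManagementPath(
                "Win32_ComputerSystem.Name='" + Environment.MachineName + "'");

            using (var computerSystem = new ManagementObject(scope, path, null))
            {
                ManagementBaseObject inParams =
                    computerSystem.GetMethodParameters("JoinDomainOrWorkgroup");
                inParams["Name"] = "example.local";              // placeholder domain
                inParams["UserName"] = @"EXAMPLE\joinaccount";   // placeholder credentials
                inParams["Password"] = "********";
                inParams["FJoinOptions"] = JoinDomainFlags;

                ManagementBaseObject outParams =
                    computerSystem.InvokeMethod("JoinDomainOrWorkgroup", inParams, null);
                Console.WriteLine("ReturnValue = " + outParams["ReturnValue"]); // 0 means success
            }
        }
    }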

    Read the article

  • iis7 .net webservice 404 error

    - by user11457
    I have a web service /test/Service1.asmx in the same folder as a page /test/test.aspx. The page works fine, but I get the message below for the service in the same location. I know the file is there and the URL is correct, and I have added the script module and managed handler as well. If anyone knows what I'm missing here I'd appreciate it. Server Error in '/' Application. The resource cannot be found. Description: HTTP 404. The resource you are looking for (or one of its dependencies) could have been removed, had its name changed, or is temporarily unavailable. Please review the following URL and make sure that it is spelled correctly. Requested URL: /test/Service1.asmx Version Information: Microsoft .NET Framework Version:2.0.50727.4200; ASP.NET Version:2.0.50727.4016

    Read the article

  • Cross domain javascript form filling, reverse proxy

    - by Michel van Engelen
    I need a JavaScript form filler that can bypass the 'same origin policy' most modern browsers implement. I made a script that opens the desired website/form in a new browser window. With the handle returned by the window.open method, I want to retrieve the inputs with theWindowHandler.document.getElementById('inputx') and fill them (access denied). Is it possible to solve this problem by using ISAPI Rewrite (official site) in IIS 6 acting as a reverse proxy? If so, how would I configure the reverse proxy? This is how far I got: RewriteEngine on RewriteLogLevel 9 LogLevel debug RewriteRule CarChecker https://the.actualcarchecker.com/CheckCar.aspx$1 [NC,P] The rewrite works, http://ourcompany.com/ourapplication/CarChecker, as evident in the logging. From within our company site I can run the car checker as if it were in our own domain. Except the 'same origin policy' is still in force. Regards, Michel

    Read the article

  • Shared Excel WorkBook is locked by another user

    - by Simone
    I’ve been trying everything; this is the last chance I have. I moved folders and files from an old Windows Server 2003 file server to a new FS (Windows Server 2008 R2) with DFS and ABE enabled. Now a specific shared Excel file is driving me crazy: out of nowhere, many times per day, users get the following error while opening the file: Filename.xlsx is locked for editing by ‘another user’. Open ‘Read-Only’ or click ‘Notify’ to open. I’ve already followed this, with no joy: http://blogs.technet.com/b/the_microsoft_excel_support_team_blog/archive/2012/05/14/the-definitive-locked-file-post.aspx In any case, I strongly suspect this is not client-related, since the problem never occurred with Windows Server 2003. I’ve found and followed many other solutions, with no result. The users are all on Office 2010 on Windows 7 machines, apart from a few users who are still on Windows XP machines. I appreciate any help, thank you!

    Read the article

  • Mcafee Auto-update from UNC path problem

    - by Vicky
    I have a network of 50 computers with no internet access. Instead of updating each of them individually with a DAT file, I tried to create a shared folder on the server and added a UNC path to the site repository. I downloaded the file 'DAT Package For Use with McAfee AutoUpdate Architect & ePO 3.0' from http://www.mcafee.com/apps/downloads/security-updates/security-updates.aspx. When I try to update, it gives the error 'Error occurred while downloading file SiteStat.xml'. How can I fix it?

    Read the article

  • How do I get around "Access is Denied" [Number: 5 (0x80070005)], with IIS6/FastCGI and PHP 5.2.3?

    - by Evan Carroll
    I'm getting this error with IIS 6.0 (I assume), PHP 5.2.3, and FastCGI: FastCGI Error The FastCGI Handler was unable to process the request. Error Details: Error Number: 5 (0x80070005). Error Description: Access is denied. HTTP Error 500 - Server Error. Internet Information Services (IIS) Any ideas? There is nothing revealing in the logs (other than 500 errors); this is pretty much all I have to work with. The script has read and execute privileges for the internet guest account, and I've added read/execute privileges to the whole of D:\PHP. I followed this tutorial http://learn.iis.net/page.aspx/247/using-fastcgi-to-host-php-applications-on-iis-60/ to set it up. The only major deviation is that I installed PHP to D:\PHP.

    Read the article

  • How can I get Virtual Server 2005 R2 running on Windows Server 2008 R2?

    - by Bret Fisher
    For various reasons (old VT-less hardware, and .vhd support) we need to still run Virtual Server 2005 R2. It's just for lab/demo work but we'd like to run the host on the newest Windows OS possible. It's documented and at least partially supported to run the old Virtual Server 2005 R2 SP1 on Windows Server 2008 (non-R2). I've done that before. I'm wondering if anyone has gotten the scenario in the title above to work. This post says it's possible but has anyone here actually done it before I go through that process: http://blogs.infosupport.com/blogs/ericd/archive/2009/08/31/running-virtual-server-2005-r2-sp1-on-windows-server-2008-r2.aspx

    Read the article

< Previous Page | 123 124 125 126 127 128 129 130 131 132 133 134  | Next Page >