Search Results

Search found 25811 results on 1033 pages for 'visual studio 2008'.


  • Right column content overflowing over my footer

    - by Sixfoot Studio
    Hi All, I know this has got something to do with a float, but I cannot seem to figure out where I am going wrong with this. Please check this page for me, the content in the right column is flowing over my footer. http://sun-eng.sixfoot.co.za/index.php?option=com_weblinks&view=categories&Itemid=48 Thanks! James

    Read the article

  • jQuery Toggle switch images

    - by Sixfoot Studio
    Hi Guys, I have a situation where I am not understanding jQuery's Toggle. I have two buttons, both of them open the same content and when you click on either button the content should open or close and the attribute changes the button from open to closed. (So both buttons do the same function). Only thing is, when I click on the top button and it opens my content and then click on the lower button to close it, the image attributes are switched incorrectly. Here's a very stripped down version of what my code looks like and I would appreciate some help.
    <script language="javascript" type="text/javascript">
    $(function () {
        var open = false;
        $("#button1, #button2").toggle(
            function () {
                open = true;
                $("#button1").attr("src", "images/btn-open.gif");
                $("#button2").attr("src", "images/btn-open.gif");
            },
            function () {
                if (open) {
                    $("#button1").attr("src", "images/btn-closed.gif");
                    $("#button2").attr("src", "images/btn-closed.gif");
                } else {
                    $("#button1").attr("src", "images/btn-open.gif");
                    $("#button2").attr("src", "images/btn-open.gif");
                }
                open = false;
            }
        );
    });
    </script>
    <img id="button1" src="images/btn-open.gif"></img>
    <br /> <br /> <br /> <br />
    <img id="button2" src="images/btn-open.gif"></img>

    Read the article

  • SQL SERVER – Introduction to Extended Events – Finding Long Running Queries

    - by pinaldave
    The job of an SQL Consultant is very interesting as always. The month before, I was busy doing query optimization and performance tuning projects for our clients, and this month, I am busy delivering my Microsoft SQL Server 2005/2008 Query Optimization & Performance Tuning Course. I recently read a white paper about Extended Events by SQL Server MVP Jonathan Kehayias. You can read the white paper here: Using SQL Server 2008 Extended Events. I also read another appealing chapter by Jonathan in the book Professional SQL Server 2008 Internals and Troubleshooting. After reading these excellent notes by Jonathan, I decided to upgrade my course and include Extended Events as one of the modules. This week, I delivered the Extended Events session twice, and attendees really liked it. They think Extended Events is one of the most powerful tools available. Extended Events can do many things. I suggest that you read the white paper I mentioned to learn more about this tool. Instead of writing long theory, I am going to write a very quick script for Extended Events. This event session captures all the long-running queries from the moment the event session was started. One of the many advantages of Extended Events is that it can be configured very easily, and it is a robust method for collecting the information needed for troubleshooting. There are many targets where you can store the information, including the XML file target, which I really like. In the following event session, we write the details of the event to two locations: 1) the ring buffer; and 2) an XML file. It is not necessary to write to both places; either of the two will do.
    -- Extended Event for finding *long running query*
    IF EXISTS(SELECT * FROM sys.server_event_sessions WHERE name='LongRunningQuery')
    DROP EVENT SESSION LongRunningQuery ON SERVER
    GO
    -- Create Event
    CREATE EVENT SESSION LongRunningQuery ON SERVER
    -- Add event to capture event
    ADD EVENT sqlserver.sql_statement_completed
    (
    -- Add action - event property
    ACTION (sqlserver.sql_text, sqlserver.tsql_stack)
    -- Predicate - time 1000 millisecond
    WHERE sqlserver.sql_statement_completed.duration > 1000
    )
    -- Add target for capturing the data - XML File
    ADD TARGET package0.asynchronous_file_target(
    SET filename='c:\LongRunningQuery.xet', metadatafile='c:\LongRunningQuery.xem'),
    -- Add target for capturing the data - Ring Buffer
    ADD TARGET package0.ring_buffer (SET max_memory = 4096)
    WITH (max_dispatch_latency = 1 seconds)
    GO
    -- Enable Event
    ALTER EVENT SESSION LongRunningQuery ON SERVER STATE=START
    GO
    -- Run long query (longer than 1000 ms)
    SELECT * FROM AdventureWorks.Sales.SalesOrderDetail ORDER BY UnitPriceDiscount DESC
    GO
    -- Stop the event
    ALTER EVENT SESSION LongRunningQuery ON SERVER STATE=STOP
    GO
    -- Read the data from Ring Buffer
    SELECT CAST(dt.target_data AS XML) AS xmlLockData
    FROM sys.dm_xe_session_targets dt
    JOIN sys.dm_xe_sessions ds ON ds.Address = dt.event_session_address
    JOIN sys.server_event_sessions ss ON ds.Name = ss.Name
    WHERE dt.target_name = 'ring_buffer' AND ds.Name = 'LongRunningQuery'
    GO
    -- Read the data from XML File
    SELECT event_data_XML.value('(event/data[1])[1]','VARCHAR(100)') AS Database_ID,
    event_data_XML.value('(event/data[2])[1]','INT') AS OBJECT_ID,
    event_data_XML.value('(event/data[3])[1]','INT') AS object_type,
    event_data_XML.value('(event/data[4])[1]','INT') AS cpu,
    event_data_XML.value('(event/data[5])[1]','INT') AS duration,
    event_data_XML.value('(event/data[6])[1]','INT') AS reads,
    event_data_XML.value('(event/data[7])[1]','INT') AS writes,
    event_data_XML.value('(event/action[1])[1]','VARCHAR(512)') AS sql_text,
    event_data_XML.value('(event/action[2])[1]','VARCHAR(512)') AS tsql_stack,
    CAST(event_data_XML.value('(event/action[2])[1]','VARCHAR(512)') AS XML).value('(frame/@handle)[1]','VARCHAR(50)') AS handle
    FROM (
    SELECT CAST(event_data AS XML) event_data_XML, *
    FROM sys.fn_xe_file_target_read_file ('c:\LongRunningQuery*.xet', 'c:\LongRunningQuery*.xem', NULL, NULL)) T
    GO
    -- Clean up. Drop the event
    DROP EVENT SESSION LongRunningQuery ON SERVER
    GO
    Just run the above query; afterwards you will find the following result set. This result set contains the queries that ran for over 1000 ms. In our example I used the XML file target, and it does not reset when the SQL service or the computer restarts (if you are using the DMV, it will reset when the SQL service restarts). This event session can be very helpful for troubleshooting. Let me know if you want me to write more about Extended Events. I am totally fascinated with this feature, so I'm planning to acquire more knowledge about it so I can determine its other usages. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Optimization, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL Training, SQLServer, T SQL, Technology Tagged: SQL Extended Events

    Read the article

  • SQL SERVER – History of SQL Server Database Encryption

    - by pinaldave
    I recently met Michael Coles and Rodney Landrum, the authors of the one-of-a-kind book Expert SQL Server 2008 Encryption, at SQLPASS in Seattle. During the conversation we ended up discussing how Microsoft is evolving encryption technology, and the same discussion led to the history of encryption tools in SQL Server. Michael pointed me to page 18 of his encryption book and explicitly gave me permission to reproduce the relevant part of that history from the book.
    Encryption in SQL Server 2000
    Built-in cryptographic encryption functionality was nonexistent in SQL Server 2000 and prior versions. In order to get server-side encryption in SQL Server you had to resort to purchasing or creating your own SQL Server XPs. Creating your own cryptographic XPs could be a daunting task owing to the fact that XPs had to be compiled as native DLLs (using a language like C or C++) and the XP application programming interface (API) was poorly documented. In addition there were always concerns around creating well-behaved XPs that "played nicely" with the SQL Server process.
    Encryption in SQL Server 2005
    Prior to the release of SQL Server 2005 there was a flurry of regulatory activity in response to accounting scandals and attacks on repositories of confidential consumer data. Much of this regulation centered on the need for protecting and controlling access to sensitive financial and consumer information. With the release of SQL Server 2005 Microsoft responded to the increasing demand for built-in encryption by providing the necessary tools to encrypt data at the column level. This functionality prominently featured the following: Support for column-level encryption of data using symmetric keys or passphrases. Built-in access to a variety of symmetric and asymmetric encryption algorithms, including AES, DES, Triple DES, RC2, RC4, and RSA. Capability to create and manage symmetric keys. Key creation and management. Ability to generate asymmetric keys and self-signed certificates, or to install external asymmetric keys and certificates. Implementation of a hierarchical model for encryption key management, similar to the ANSI X9.17 standard model. SQL functions to generate one-way hash codes and digital signatures, including SHA-1 and MD5 hashes. Additional SQL functions to encrypt and decrypt data. Extensions to the SQL language to support creation, use, and administration of encryption keys and certificates. SQL CLR extensions that provide access to .NET-based encryption functionality.
    Encryption in SQL Server 2008
    Encryption demands have increased over the past few years. For instance, there has been a demand for the ability to store encryption keys "off-the-box," physically separate from the database and the data it contains. There is also a recognized requirement for legacy databases and applications to take advantage of encryption without changing the existing code base. To address these needs, SQL Server 2008 adds the following features to its encryption arsenal: Transparent Data Encryption (TDE): Allows you to encrypt an entire database, including log files and the tempdb database, in such a way that it is transparent to client applications. Extensible Key Management (EKM): Allows you to store and manage your encryption keys on an external device known as a hardware security module (HSM). Cryptographic random number generation functionality. Additional cryptography-related catalog views and dynamic management views. SQL language extensions to support the new encryption functionality.
    The encryption book covers all of these tools in its various chapters in one simple story. If you are interested in how encryption evolved and reached the stage where it is today, this book is a must for everyone. You can read my earlier review of the book over here. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Book Review, SQLAuthority News, T SQL, Technology Tagged: Encryption, SQL Server Encryption, SQLPASS
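    To make the SQL Server 2005 column-level encryption features listed above a little more concrete, here is a minimal illustrative sketch of the typical workflow. The key names, password, and table are invented for the example and are not from the book; a real deployment would follow the key-management guidance the book discusses.
    -- Minimal sketch of SQL Server 2005+ column-level encryption (illustrative names only)
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ngP@ssw0rd!';
    CREATE CERTIFICATE DemoCert WITH SUBJECT = 'Demo column encryption certificate';
    CREATE SYMMETRIC KEY DemoSymKey WITH ALGORITHM = AES_256 ENCRYPTION BY CERTIFICATE DemoCert;
    GO
    -- Encrypt on the way in, decrypt on the way out (dbo.Customer is a hypothetical table)
    OPEN SYMMETRIC KEY DemoSymKey DECRYPTION BY CERTIFICATE DemoCert;
    UPDATE dbo.Customer SET SSN_Encrypted = EncryptByKey(Key_GUID('DemoSymKey'), SSN_Plain);
    SELECT CONVERT(VARCHAR(11), DecryptByKey(SSN_Encrypted)) AS SSN_Plain FROM dbo.Customer;
    CLOSE SYMMETRIC KEY DemoSymKey;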

    Read the article

  • SQLAuthority News – Speaking Sessions at TechEd India – 3 Sessions – 1 Panel Discussion

    - by pinaldave
    Microsoft Tech-Ed India 2010 is considered the major technology event of the year for IT professionals and developers. This event will feature a comprehensive forum in which to learn, connect, explore, and evolve the current technologies we have today. I would recommend this event to you since here you will learn about today's cutting-edge trends, thereby enhancing your work profile and getting ahead of the rest. But the most important benefit of all might be the networking opportunity that you can attain by attending the forum. You can build personal connections with various Microsoft experts and peers that will last even far beyond this event! It also feels good to let you know that I will be speaking at this year's event! So, here are the sessions that await you in this mega-forum.
    Session 1: True Lies of SQL Server – SQL Myth Buster
    Date: April 12, 2010  Time: 11:15pm – 11:45pm
    In this 30-minute demo session, I am going to briefly demonstrate a few SQL Server myths and their resolutions, backed up with demos. This demo session is a must-attend for all developers and administrators who will come to the event. This is going to be a very quick yet fun session.
    Session 2: Master Data Services in Microsoft SQL Server 2008 R2
    Date: April 12, 2010  Time: 2:30pm-3:30pm
    SQL Server Master Data Services will ship with SQL Server 2008 R2 and will improve Microsoft's platform appeal. This session provides an in-depth demonstration of MDS features and highlights important usage scenarios. Master Data Services enables consistent decision making by allowing you to create, manage and propagate changes from a single master view of your business entities. In addition, the MDS master data hub, which is the vital component, helps ensure reporting consistency across systems and delivers faster, more accurate results across the enterprise. We will talk about establishing the basis for a centralized approach to defining, deploying, and managing master data in the enterprise.
    Session 3: Developing with SQL Server Spatial and Deep Dive into Spatial Indexing
    Date: April 14, 2010 Time: 5:00pm-6:00pm
    Microsoft SQL Server 2008 delivers new spatial data types that enable you to consume, use, and extend location-based data through spatial-enabled applications. Attend this session to learn how to use spatial functionality in the next version of SQL Server to build and optimize spatial queries. This session outlines how to use the new geography data type to store geodetic spatial data and perform operations on it, use the new geometry data type to store planar spatial data and perform operations on it, take advantage of new spatial indexes for high-performance queries, use the new spatial results tab to quickly and easily view spatial query results directly from within Management Studio, extend spatial data capabilities by building or integrating location-enabled applications through support for spatial standards and specifications, and much more.
    Panel Discussion: Harness the power of Web – SEO and Technical Blogging
    Date: April 12, 2010 Time: 5:00pm-6:00pm
    Here you will learn lots of tricks and tips about SEO and technical blogging from various industry technical blogging experts. This event will surely be one of the most important tech conventions of 2010. TechEd is going to be a very busy time for tech developers and enthusiasts, since every evening there will be a fun session to attend.
    If you are interested in any of the above topics, I suggest that you attend the corresponding sessions, as you will learn a great deal about each one. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: MVP, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology Tagged: TechEd, TechEdIn

    Read the article

  • Master Note for Generic Data Warehousing

    - by lajos.varady(at)oracle.com
    ++++++++++++++++++++++++++++++++++++++++++++++++++++
    The complete and most recent version of this article can be viewed from the My Oracle Support Knowledge Section: Master Note for Generic Data Warehousing [ID 1269175.1]
    ++++++++++++++++++++++++++++++++++++++++++++++++++++
    In this Document: Purpose; Master Note for Generic Data Warehousing; Components covered; Oracle Database Data Warehousing specific documents for recent versions; Technology Network Product Homes; Master Notes available in My Oracle Support; White Papers; Technical Presentations
    Platforms: 1-914CU
    This document is being delivered to you via Oracle Support's Rapid Visibility (RaV) process and therefore has not been subject to an independent technical review.
    Applies to: Oracle Server - Enterprise Edition - Version: 9.2.0.1 to 11.2.0.2 - Release: 9.2 to 11.2. Information in this document applies to any platform.
    Purpose: Provide a navigation path.
    Master Note for Generic Data Warehousing
    Components covered: Read Only Materialized Views; Query Rewrite; Database Object Partitioning; Parallel Execution and Parallel Query; Database Compression; Transportable Tablespaces; Oracle Online Analytical Processing (OLAP); Oracle Data Mining
    Oracle Database Data Warehousing specific documents for recent versions: 11g Release 2 (11.2); 11g Release 1 (11.1); 10g Release 2 (10.2); 10g Release 1 (10.1); 9i Release 2 (9.2); 9i Release 1 (9.0)
    Technology Network Product Homes: Oracle Partitioning; Advanced Compression; Oracle Data Mining; Oracle OLAP
    Master Notes available in My Oracle Support
    These technical articles have been written by Oracle Support Engineers to provide proactive and top-level information and knowledge about the components of the database we handle under "Database Data Warehousing".
    Note 1166564.1 Master Note: Transportable Tablespaces (TTS) -- Common Questions and Issues
    Note 1087507.1 Master Note for MVIEW 'ORA-' error diagnosis. For Materialized View CREATE or REFRESH
    Note 1102801.1 Master Note: How to Get a 10046 trace for a Parallel Query
    Note 1097154.1 Master Note Parallel Execution Wait Events
    Note 1107593.1 Master Note for the Oracle OLAP Option
    Note 1087643.1 Master Note for Oracle Data Mining
    Note 1215173.1 Master Note for Query Rewrite
    Note 1223705.1 Master Note for OLTP Compression
    Note 1269175.1 Master Note for Generic Data Warehousing
    White Papers
    Transportable Tablespaces white papers: Database Upgrade Using Transportable Tablespaces: Oracle Database 11g Release 1 (February 2009); Platform Migration Using Transportable Database: Oracle Database 11g and 10g Release 2 (August 2008); Database Upgrade using Transportable Tablespaces: Oracle Database 10g Release 2 (April 2007); Platform Migration using Transportable Tablespaces: Oracle Database 10g Release 2 (April 2007)
    Parallel Execution and Parallel Query white papers: Best Practices for Workload Management of a Data Warehouse on the Sun Oracle Database Machine (June 2010); Effective resource utilization by In-Memory Parallel Execution in Oracle Real Application Clusters 11g Release 2 (Feb 2010); Parallel Execution Fundamentals in Oracle Database 11g Release 2 (November 2009); Parallel Execution with Oracle Database 10g Release 2 (June 2005)
    Oracle Data Mining white paper: Oracle Data Mining 11g Release 2 (March 2010)
    Partitioning white papers: Partitioning with Oracle Database 11g Release 2 (September 2009); Partitioning in Oracle Database 11g (June 2007)
    Materialized Views and Query Rewrite white papers: Oracle Materialized Views and Query Rewrite (May 2005); Improving Performance using Query Rewrite in Oracle Database 10g (December 2003)
    Database Compression white papers: Advanced Compression with Oracle Database 11g Release 2 (September 2009); Table Compression in Oracle Database 10g Release 2 (May 2005)
    Oracle OLAP white papers: On-line Analytic Processing with Oracle Database 11g Release 2 (September 2009); Using Oracle Business Intelligence Enterprise Edition with the OLAP Option to Oracle Database 11g (July 2008)
    Generic white papers: Enabling Pervasive BI through a Practical Data Warehouse Reference Architecture (February 2010); Optimizing and Protecting Storage with Oracle Database 11g Release 2 (November 2009); Oracle Database 11g for Data Warehousing and Business Intelligence (August 2009); Best practices for a Data Warehouse on Oracle Database 11g (September 2008)
    Technical Presentations
    A selection of ObE - Oracle by Example documents: Generic: Using Basic Database Functionality for Data Warehousing (10g). Partitioning: Manipulating Partitions in Oracle Database (11g Release 1); Using High-Speed Data Loading and Rolling Window Operations with Partitioning (11g Release 1); Using Partitioned Outer Join to Fill Gaps in Sparse Data (10g). Materialized View and Query Rewrite: Using Materialized Views and Query Rewrite Capabilities (10g); Using the SQLAccess Advisor to Recommend Materialized Views and Indexes (10g). Oracle OLAP: Using Microsoft Excel With Oracle 11g Cubes (how to analyze data in Oracle OLAP Cubes using Excel's native capabilities); Using Oracle OLAP 11g With Oracle BI Enterprise Edition (Creating OBIEE Metadata for OLAP 11g Cubes and querying those in BI Answers); Building OLAP 11g Cubes; Querying OLAP 11g Cubes; Creating Interactive APEX Reports Over OLAP 11g Cubes
    A selection of presentations from the BIWA website: Extreme Data Warehousing With Exadata by Hermann Baer (July 2010) (slides 2.5MB, recording 54MB); Data Mining Made Easy! Introducing Oracle Data Miner 11g Release 2 New "Work flow" GUI by Charlie Berger (May 2010) (slides 4.8MB, recording 85MB); Best Practices for Deploying a Data Warehouse on Oracle Database 11g by Maria Colgan (December 2009) (slides 3MB, recording 18MB, white paper 3MB)

    Read the article

  • ASP.NET Routing not working on IIS 7.0

    - by Rick Strahl
    I ran into a nasty little problem today when deploying an application using ASP.NET 4.0 Routing to my live server. The application and its Routing were working just fine on my dev machine (Windows 7 and IIS 7.5), but when I deployed (Windows 2008 R1 and IIS 7.0) Routing would just not work. Every time I hit a routed url IIS would just throw up a 404 error: This is an IIS error, not an ASP.NET error so this doesn’t actually come from ASP.NET’s routing engine but from IIS’s handling of expressionless URLs. Note that it’s clearly falling through all the way to the StaticFile handler which is the last handler to fire in the typical IIS handler list. In other words IIS is trying to parse the extension less URL and not firing it into ASP.NET but failing. As I mentioned on my local machine this all worked fine and to make sure local and live setups match I re-copied my Web.config, double checked handler mappings in IIS and re-copied the actual application assemblies to the server. It all looked exactly matched. However no workey on the server with IIS 7.0!!! Finally, totally by chance, I remembered the runAllManagedModulesForAllRequests attribute flag on the modules key in web.config and set it to true: <system.webServer> <modules runAllManagedModulesForAllRequests="true"> <add name="ScriptCompressionModule" type="Westwind.Web.ScriptCompressionModule,Westwind.Web" /> </modules> </system.webServer> And lo and behold, Routing started working on the live server and IIS 7.0! This seems really obvious now of course, but the really tricky thing about this is that on IIS 7.5 this key is not necessary. So on my Windows 7 machine ASP.NET Routing was working just fine without the key set. However on IIS 7.0 on my live server the same missing setting was not working. On IIS 7.0 this key must be present or Routing will not work. Oddly on IIS 7.5 it appears that you can’t even turn off the behavior – setting runtAllManagedModuleForAllRequests="false" had no effect at all and Routing continued to work just fine even with the flag set to false, which is NOT what I would have expected. Kind of disappointing too that Windows Server 2008 (R1) can’t be upgraded to IIS 7.5. It sure seems like that should have been possible since the OS server core changes in R2 are pretty minor. For the future I really hope Microsoft will allow updating IIS versions without tying them explicitly to the OS. It looks like that with the release of IIS Express Microsoft has taken some steps to untie some of those tight OS links from IIS. Let’s hope that’s the case for the future – it sure is nice to run the same IIS version on dev and live boxes, but upgrading live servers is too big a deal to do just because an updated OS release came out. Moral of the story – never assume that your dev setup will work as is on the live setup. It took me forever to figure this out because I assumed that because my web.config on the local machine was fine and working and I copied all relevant web.config data to the server it can’t be the configuration settings. I was looking everywhere but in the .config file forever before getting desperate and remembering the flag when I accidentally checked the intellisense settings in the modules key. Never assume anything. The other moral is: Try to keep your dev machine and server OS’s in sync whenever possible. Maybe it’s time to upgrade to Windows Server 2008 R2 after all. More info on Extensionless URLs in IIS Want to find out more exactly on how extensionless Urls work on IIS 7? 
    Then check out How ASP.NET MVC Routing Works and its Impact on the Performance of Static Requests, which goes into great detail on the complexities of the process. Thanks to Jeff Graves for pointing me at this article – a great linked reference for this topic! © Rick Strahl, West Wind Technologies, 2005-2011. Posted in IIS7, Windows

    Read the article

  • Implementing Database Settings Using Policy Based Management

    - by Ashish Kumar Mehta
    Introduction
    Database administrators have always had a tough time ensuring that all the SQL Servers they administer are configured according to the policies and standards of the organization. Using SQL Server's Policy Based Management feature, DBAs can now manage one or more instances of SQL Server 2008 and check for policy compliance issues. In this article we will utilize the Policy Based Management (aka Declarative Management Framework, or DMF) feature of SQL Server to implement and verify database settings on all production databases. It is best practice to enforce the settings below on each production database. However, it can be tedious to go through each database and then check whether these settings are implemented across databases. In this article I will explain how to utilize the Policy Based Management feature of SQL Server 2008 to create a policy to verify these settings on all databases and, in cases of non-compliance, how to bring them back into compliance.
    Database settings to enforce on each user database:
    Auto Close and Auto Shrink properties of the database set to False
    Auto Create Statistics and Auto Update Statistics set to True
    Compatibility Level of all user databases set to 100
    Page Verify set to CHECKSUM
    Recovery Model of all user databases set to Full
    Restrict Access set to MULTI_USER
    Configure a Policy to Verify Database Settings
    1. Connect to the SQL Server 2008 instance using SQL Server Management Studio.
    2. In the Object Explorer, click on Management > Policy Management and you will be able to see Policies, Conditions & Facets as child nodes.
    3. Right click Policies and then select New Policy… from the drop-down list as shown in the snippet below to open the Create New Policy popup window.
    4. In the Create New Policy popup window you need to provide the name of the policy as "Implementing and Verify Database Settings for Production Databases" and then click the drop-down list under Check Condition. As highlighted in the snippet below, click on the New Condition… option to open up the Create New Condition window.
    5. In the Create New Condition popup window you need to provide the name of the condition as "Verify and Change Database Settings". In the Facet drop-down list you need to choose the facet Database Options as shown in the snippet below. Under Expression you need to select the Field value @AutoClose, then choose the Operator value ' = ' and finally choose the Value False. Now that you have successfully added the first field, you can go ahead and add the rest of the fields as shown in the snippet below. Once you have successfully added all of the Database Options facet fields shown above, click OK to save the changes and return to the parent Create New Policy – Implementing and Verify Database Settings for Production Databases window, where you will see that the newly created condition "Verify and Change Database Settings" is selected by default. Continues…
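    As a side note to the excerpt above: before (or alongside) building the policy in the GUI, the same settings can be spot-checked with a quick query against sys.databases. This is only an ad-hoc verification sketch added for illustration; it is not part of the original article and does not replace Policy Based Management.
    -- Ad-hoc check of the database settings listed above (SQL Server 2008 catalog views)
    SELECT name,
           is_auto_close_on,
           is_auto_shrink_on,
           is_auto_create_stats_on,
           is_auto_update_stats_on,
           compatibility_level,
           page_verify_option_desc,
           recovery_model_desc,
           user_access_desc
    FROM sys.databases
    WHERE database_id > 4;   -- skip the system databases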

    Read the article

  • MacGyver Moments

    - by Geoff N. Hiten
    Denny Cherry tagged me to write about my best MacGyver Moment.  Usually I ignore blogosphere fluff and just use this space to write what I think is important.  However, #MVP10 just ended and I have a stronger sense of community.  Besides, where else would I mention my second best Macgyver moment was making a BIOS jumper out of a soda can.  Aluminum is conductive and I didn't have any real jumpers lying around. My best moment is probably my entire home computer network.  Every system but one is hand-built, usually cobbled together out of spare parts and 'adapted' from its original purpose. My Primary Domain Controller is a Dell 2300.   The Service Tag indicates it was shipped to the original owner in 1999.  Box has a PERC/1 RAID controller.  I acquired this from a previous employer for $50.  It runs Windows Server 2003 Enterprise Edition.  Does DNS, DHCP, and RADIUS services as a bonus.  RADIUS authentication is used for VPN and Wireless access.  It is nice to sign in once and be done with it. The Secondary Domain Controller is an old desktop.  Dual P-III 933 with some extra drives. My VPN box is a P-II 250 with 384MB of RAM and a 21 GB hard drive.  I did a P-to-V to my Hyper-V box a year or so ago and retired the hardware again.  Dynamic DNS lets me connect no matter how often Comcast shuffles my IP. The Hyper-V box is a desktop system with 8GB RAM and an AMD Athlon 5000+ processor.  Cost me less than $500 to put together nearly two years ago.  I reasoned that if Vista and Windows 2008 were the same code then Vista 64-bit certified meant the drivers for Vista would load into Windows 2008.  Turns out I was right. Later I added three 1TB drives but wasn't too happy with how that turned out.  I recovered two of the drives yesterday and am building an iSCSI storage unit. (Much thanks to Starwind.  Great product).  I am using an old AMD 1.1GhZ box with 1.5 GB RAM (cobbled together from three old PCs) as my storave server.  The Hyper-V box is slated for an OS rebuild to 2008 R2 once I get the storage system worked out.  maybe in a week or two. A couple of DLink Gigabit switches ties everything together. Add in the Vonage box, the three PCs, the Wireless-N Access Point, the two notebooks and the XBox and you have gone from MacGyver to darn near Rube Goldberg. The only thing I really spend money on is power supplies and fans.  I buy top-of-the-line for both. I even pull and crimp my own cables. Oh, and if my kids hose up a PC, I have all of their data on a server elsewhere.  Every PC and laptop is pretty much interchangable for email and basic workstation tasks.  That helps a lot too. Of course I will tag SQLVariant.

    Read the article

  • LSI SAS 9240-8i on Ubuntu 12.04 Hangs on Modprobe

    - by Francois Stark
    I used the LSI 9240-8i card on a smaller Intel motherboard with no problems in Ubuntu, with ZFS. However, we rebuilt the server to allow for more disks, using the ASROCK X79 Extreme 11 motherboard. It has 7 PCIe slots, and a LSI 2008 on-board. At first I thought the LSI 9240, when plugged in to PCIe, clashed with the on-board LSI 2008. Every time I plugged in the LSI 9240, modprobe would hang. Then I completely disabled the on-board LSI 2008, and the problem persisted. Last night it booted perfectly ONCE - all LSI cards and connected disks visible... However, all subsequent reboots failed. Both LSI cards' bios scans appear and they both see the disks connected to them, but Ubuntu modprobe hangs. Some selected dmesg lines, with both LSI cards enabled: [ 190.752100] megasas: [ 0]waiting for 1 commands to complete [ 195.772071] megasas: [ 5]waiting for 1 commands to complete [ 200.792079] megasas: [10]waiting for 1 commands to complete [ 205.812078] megasas: [15]waiting for 1 commands to complete [ 210.832037] megasas: [20]waiting for 1 commands to complete [ 215.852077] megasas: [25]waiting for 1 commands to complete [ 220.872072] megasas: [30]waiting for 1 commands to complete [ 225.892078] megasas: [35]waiting for 1 commands to complete [ 230.912086] megasas: [40]waiting for 1 commands to complete [ 235.932075] megasas: [45]waiting for 1 commands to complete [ 240.306157] usb 2-1.5: USB disconnect, device number 7 [ 240.952076] megasas: [50]waiting for 1 commands to complete [ 240.960034] INFO: task modprobe:233 blocked for more than 120 seconds. [ 240.960055] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message. [ 240.960067] modprobe D ffffffff81806200 0 233 146 0x00000004 [ 240.960075] ffff880806ae3b48 0000000000000086 ffff880806ae3ae8 ffffffff8101adf3 [ 240.960083] ffff880806ae3fd8 ffff880806ae3fd8 ffff880806ae3fd8 0000000000013780 [ 240.960090] ffffffff81c0d020 ffff880806acae00 ffff880806ae3b58 ffff880808961720 [ 240.960096] Call Trace: [ 240.960107] [<ffffffff8101adf3>] ? native_sched_clock+0x13/0x80 [ 240.960116] [<ffffffff816579cf>] schedule+0x3f/0x60 [ 240.960137] [<ffffffffa00093f5>] megasas_issue_blocked_cmd+0x75/0xb0 [megaraid_sas] [ 240.960144] [<ffffffff8108aa50>] ? add_wait_queue+0x60/0x60 [ 240.960154] [<ffffffffa000a6c9>] megasas_get_seq_num+0xd9/0x260 [megaraid_sas] [ 240.960164] [<ffffffffa000ab31>] megasas_start_aen+0x31/0x60 [megaraid_sas] [ 240.960174] [<ffffffffa00136f1>] megasas_probe_one+0x69a/0x81c [megaraid_sas] [ 240.960182] [<ffffffff813345bc>] local_pci_probe+0x5c/0xd0 [ 240.960189] [<ffffffff81335e89>] __pci_device_probe+0xf9/0x100 [ 240.960197] [<ffffffff8130ce6a>] ? kobject_get+0x1a/0x30 [ 240.960205] [<ffffffff81335eca>] pci_device_probe+0x3a/0x60 [ 240.960212] [<ffffffff813f5278>] really_probe+0x68/0x190 [ 240.960217] [<ffffffff813f5505>] driver_probe_device+0x45/0x70 [ 240.960223] [<ffffffff813f55db>] __driver_attach+0xab/0xb0 [ 240.960227] [<ffffffff813f5530>] ? driver_probe_device+0x70/0x70 [ 240.960233] [<ffffffff813f5530>] ? driver_probe_device+0x70/0x70 [ 240.960237] [<ffffffff813f436c>] bus_for_each_dev+0x5c/0x90 [ 240.960243] [<ffffffff813f503e>] driver_attach+0x1e/0x20 [ 240.960248] [<ffffffff813f4c90>] bus_add_driver+0x1a0/0x270 [ 240.960255] [<ffffffffa001e000>] ? 0xffffffffa001dfff [ 240.960260] [<ffffffff813f5b46>] driver_register+0x76/0x140 [ 240.960266] [<ffffffffa001e000>] ? 0xffffffffa001dfff [ 240.960271] [<ffffffff81335b66>] __pci_register_driver+0x56/0xd0 [ 240.960277] [<ffffffffa001e000>] ? 
0xffffffffa001dfff [ 240.960286] [<ffffffffa001e09e>] megasas_init+0x9e/0x1000 [megaraid_sas] [ 240.960294] [<ffffffff81002040>] do_one_initcall+0x40/0x180 [ 240.960301] [<ffffffff810a82fe>] sys_init_module+0xbe/0x230 [ 240.960307] [<ffffffff81661ec2>] system_call_fastpath+0x16/0x1b [ 240.960314] INFO: task scsi_scan_7:349 blocked for more than 120 seconds.

    Read the article

  • My Tech Ed North America Preview - Certification Edition

    - by Chris Gardner
    In my previous TechEd North America Preview, I addressed all the content I wanted to see at the show. This time, we shall turn our attention to the certifications I might try to pick up. If you have never been to TechEd North America before, one of the greatest things about the event is an on-site certification center. If you have a couple hours to spare, you can walk up to a test. The first test on my agenda is 70-523 [1]. I took this update test once, but did not do well on the MVC portion [2]. A few practice tests later, and I think I'm ready to fake that section. After that, I need to complete my road to being a master. The good folks here at work have been having a real love / hate relationship with the idea of me becoming an MCM in SQL Server [3]. Of course, before I do that, I need to finally take the SQL administration tests. Thus, we shall add 70-432 [4] and 70-450 [5] to the list. Speaking of MCM, TechEd North America will have a special on test 88-970 [6]. This test is normally $500, and you have to find a place to take it [7]. However, there is a special 50% off rate for people who take it on location. With those kinds of prices, I may just take it as a form of study guide. As a final push, I may take some Windows Phone exams. I mentioned in my previous post that I may attend the 70-599 [8] Exam Cram session. Unfortunately, I will be staffing the Hands-On-Lab at that time. As we know, this has never stopped me from taking a test. This may lead to fits of 70-506 [9], but after we've come this far... That should complete my list. Do I really think I'll find time to take 6 tests at TechEd North America? Probably not. I have done it at TechEd North America before, but that was before I was TechEd North America staff. I also had a co-worker pass 9 in one year, but he basically did nothing but travel to Orlando in 2007 to take tests. And what's the point of attending a HUGE conference if you don't network? Of course, networking will have to wait for Friday's post...
    [1] Upgrade: Transition Your MCPD .NET Framework 3.5 Web Developer Skills to MCPD .NET Framework 4 Web Developer
    [2] Because I never have used, nor do I really think I ever will use, MVC...
    [3] By that, I mean they love the idea, and they hate the price
    [4] Microsoft SQL Server 2008, Implementation and Maintenance
    [5] PRO: Designing, Optimizing and Maintaining a Database Administrative Solution Using Microsoft SQL Server 2008
    [6] SQL Server 2008 Microsoft Certified Master: Knowledge Exam
    [7] Which isn't nearly as expensive as the Lab Exam, nor as difficult to find a location. However, it is not offered at every testing facility.
    [8] PRO: Designing and Developing Windows Phone Applications
    [9] TS: Silverlight 4, Development

    Read the article

  • Windows Azure SDK 1.3 addresses early adopter feedback

    - by Eric Nelson
    At the end of November 2010 we released a new version of the Windows Azure SDK which contains many new features driven by the great feedback of early adopters plus a shiny new portal. New Portal implemented in Silverlight: The new portal is implemented using Silverlight and replaces the (IMHO rather clunky) original HTML + JavaScript portal. It is 100% better although does still have a few bugs. Enjoy! P.S. You can if you wish still use the old portal:   New runtime functionality: The following functionality is now generally available through the Windows Azure SDK and Windows Azure Tools for Visual Studio and the new Windows Azure Management Portal: Elevated Privileges and Full IIS. You can now run a portion or all of your code in Web and Worker roles with elevated administrator privileges. The Web role now provides Full IIS functionality, which enables multiple IIS sites per Web role and the ability to install IIS modules. Remote Desktop functionality enables you to connect to a running instance of your application or service in order to monitor activity and troubleshoot common problems. Windows Server 2008 R2 Roles: Windows Azure now supports Windows Server 2008 R2 in its Web, worker and VM roles. This new support enables you to take advantage of the full range of Windows Server 2008 R2 features such as IIS 7.5, AppLocker, and enhanced command-line and automated management using PowerShell Version 2.0. New runtime functionality – in beta: Windows Azure Virtual Machine Role: Support for more types of new and existing Windows applications will soon be available with the introduction of the Virtual Machine (VM) role. You can move more existing applications to Windows Azure, reducing the need to make costly code or deployment changes. Extra Small Windows Azure Instance, which is priced at $0.05 per compute hour, provides developers with a cost-effective training and development environment. Developers can also use the Extra Small instance to prototype cloud solutions at a lower cost. Windows Azure Connect: (formerly Project Sydney), which enables a simple and easy-to-manage mechanism to set up IP-based network connectivity between on-premises and Windows Azure resources, is the first Windows Azure Virtual Network feature that we’re making available as a CTP. You can sign up for any of the betas via the Windows Azure Management Portal. Improved processes and simplified operations New portal! (see above) Access to new diagnostic information including the ability to click on a role to see role type, deployment time and last reboot time A new sign-up process that dramatically reduces the number of steps needed to sign up for Windows Azure. New scenario based Windows Azure Platform forums to help answer questions and share knowledge more efficiently. Multiple Service Administrators: Windows Azure now supports multiple Windows Live IDs to have administrator privileges on the same Windows Azure account. The objective is to make it easy for a team to work on the same Windows Azure account while using their individual Windows Live IDs.   Related Links Please also let us know through Microsoft Platform Ready if and when you intend to build an application using the Windows Azure Platform. Or indeed if you already have (Well done). You will get access to some great benefits if you do (more on that in a future post). It also really helps us better understand the demand out there which directly impacts how we will plan the next six months of activities around the Windows Azure Platform. 
Visit Microsoft Platform Ready to tell us about your plans for your applications UK based? Interested in the Windows Azure Platform? Join http://ukazure.ning.com Get started with the Windows Azure Platform http://bit.ly/startazure

    Read the article

  • Perm SSIS Developer Urgently Required

    - by blakmk
      Job Role To provide dedicated data services support to the company, by designing, creating, maintaining and enhancing database objects, ensuring data quality, consistency and integrity. Migrating data from various sources to central SQL 2008 data warehouse will be the primary function. Migration of data from bespoke legacy database’s to SQL 2008 data warehouse. Understand key business requirements, Liaising with various aspects of the company. Create advanced transformations of data, with focus on data cleansing, redundant data and duplication. Creating complex business rules regarding data services, migration, Integrity and support (Best Practices). Experience ·         Minimum 3 year SSIS experience, in a project or BI Development role and involvement in at least 3 full ETL project life cycles, using the following methodologies and tools o    Excellent knowledge of ETL concepts including data migration & integrity, focusing on SSIS. o    Extensive experience with SQL 2005 products, SQL 2008 desirable. o    Working knowledge of SSRS and its integration with other BI products. o    Extensive knowledge of T-SQL, stored procedures, triggers (Table/Database), views, functions in particular coding and querying. o    Data cleansing and harmonisation. o    Understanding and knowledge of indexes, statistics and table structure. o    SQL Agent – Scheduling jobs, optimisation, multiple jobs, DTS. o    Troubleshoot, diagnose and tune database and physical server performance. o    Knowledge and understanding of locking, blocks, table and index design and SQL configuration. ·         Demonstrable ability to understand and analyse business processes. ·         Experience in creating business rules on best practices for data services. ·         Experience in working with, supporting and troubleshooting MS SQL servers running enterprise applications ·         Proven ability to work well within a team and liaise with other technical support staff such as networking administrators, system administrators and support engineers. ·         Ability to create formal documentation, work procedures, and service level agreements. ·         Ability to communicate technical issues at all levels including to a non technical audience. ·         Good working knowledge of MS Word, Excel, PowerPoint, Visio and Project.   Location Based in Crawley with possibility of some remote working Contact me for more info: http://sqlblogcasts.com/blogs/blakmk/contact.aspx      

    Read the article

  • Blob container creation exception ...

    - by Egon
    I get an exception every time I try to create a container for the blob using the following code blobStorageType = storageAccInfo.CreateCloudBlobClient(); ContBlob = blobStorageType.GetContainerReference(containerName); //everything fine till here ; next line creates an exception ContBlob.CreateIfNotExist(); Microsoft.WindowsAzure.StorageClient.StorageClientException was unhandled Message="One of the request inputs is out of range." Source="Microsoft.WindowsAzure.StorageClient" StackTrace: at Microsoft.WindowsAzure.StorageClient.Tasks.Task1.get_Result() at Microsoft.WindowsAzure.StorageClient.Tasks.Task1.ExecuteAndWait() at Microsoft.WindowsAzure.StorageClient.TaskImplHelper.ExecuteImplWithRetry[T](Func2 impl, RetryPolicy policy) at Microsoft.WindowsAzure.StorageClient.CloudBlobContainer.CreateIfNotExist(BlobRequestOptions options) at Microsoft.WindowsAzure.StorageClient.CloudBlobContainer.CreateIfNotExist() at WebRole1.BlobFun..ctor() in C:\Users\cloud\Documents\Visual Studio 2008\Projects\CloudBlob\WebRole1\BlobFun.cs:line 58 at WebRole1.BlobFun.calling1() in C:\Users\cloud\Documents\Visual Studio 2008\Projects\CloudBlob\WebRole1\BlobFun.cs:line 29 at AzureBlobTester.Program.Main(String[] args) in C:\Users\cloud\Documents\Visual Studio 2008\Projects\CloudBlob\AzureBlobTester\Program.cs:line 19 at System.AppDomain._nExecuteAssembly(Assembly assembly, String[] args) at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args) at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly() at System.Threading.ThreadHelper.ThreadStart_Context(Object state) at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state) at System.Threading.ThreadHelper.ThreadStart() InnerException: System.Net.WebException Message="The remote server returned an error: (400) Bad Request." Source="System" StackTrace: at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult) at Microsoft.WindowsAzure.StorageClient.EventHelper.ProcessWebResponse(WebRequest req, IAsyncResult asyncResult, EventHandler1 handler, Object sender) InnerException: Do you guys knw what is it that I am doing wrong ?

    Read the article

  • Unicode Collations problem ?

    - by Bayonian
    (.NET 3.5 SP1, VS 2008, VB.NET, MSSQL Server 2008)
    I'm writing a small web app to test Khmer Unicode and Lao Unicode. I have a table that stores text in Khmer Unicode with the following structure:
    [t_id] [int] IDENTITY(1,1) NOT NULL
    [t_chid] [int] NOT NULL
    [t_vn] [int] NOT NULL
    [t_v] [nvarchar](max) NOT NULL
    I can use LINQ to SQL to do CRUD normally. The text displays properly on the web page, even though I didn't change the default collation of MSSQL Server 2008. When it comes to searching the column [t_v], the page takes a very long time to load and, in fact, it loads every row of that column. It never compares against the "keyword" criteria that I use for the search. Here's my query for the search:
    Public Shared Function SearchTestingKhmerTable(ByVal keyword As String) As DataTable
        Dim db As New BibleDataClassesDataContext()
        Dim query = From b In db.khmer_books _
                    From ch In db.khmer_chapters _
                    From v In db.testing_khmers _
                    Where v.t_v.Contains(keyword) And ch.kh_book_id = b.kh_b_id And v.t_chid = ch.kh_ch_id _
                    Select b.kh_b_id, b.kh_b_title, ch.kh_ch_id, ch.kh_ch_number, v.t_id, v.t_vn, v.t_v
        Dim dtDataTableOne = New DataTable("dtOne")
        dtDataTableOne.Columns.Add("bid", GetType(Integer))
        dtDataTableOne.Columns.Add("btitle", GetType(String))
        dtDataTableOne.Columns.Add("chid", GetType(Integer))
        dtDataTableOne.Columns.Add("chn", GetType(Integer))
        dtDataTableOne.Columns.Add("vid", GetType(Integer))
        dtDataTableOne.Columns.Add("vn", GetType(Integer))
        dtDataTableOne.Columns.Add("verse", GetType(String))
        For Each r In query
            dtDataTableOne.Rows.Add(New Object() {r.kh_b_id, r.kh_b_title, r.kh_ch_id, r.kh_ch_number, r.t_id, r.t_vn, r.t_v})
        Next
        Return dtDataTableOne
    End Function
    Please note that I use the exact same code and database design with Lao Unicode and it works just fine; I get the returned query as expected for the search. I can't figure out what the problem is with searching the Khmer table.
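    For reference, the LINQ predicate above roughly corresponds to a direct T-SQL search like the sketch below. The table and column names are inferred from the question's LINQ query and may differ slightly from the actual schema, and the parameter is assumed. Running something along these lines in Management Studio can help separate a Unicode/collation issue from a LINQ to SQL issue; note the N prefix so the keyword is treated as Unicode.
    -- Hedged diagnostic sketch: direct Unicode search against the column described above
    DECLARE @keyword NVARCHAR(100) = N'...';   -- Khmer search term goes here
    SELECT t_id, t_chid, t_vn, t_v
    FROM dbo.testing_khmers
    WHERE t_v LIKE N'%' + @keyword + N'%';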

    Read the article

  • How to utilize WebDev.WebServer.exe (VS Web Server) in x64?

    - by Nick Craver
    Visual Studio is x86 until at least the 2010 release comes around, my question is can anyone think of a way or know of an independent ASP.NET debug server that's x64 for 2008? Background: Our ASP.NET application runs against Oracle as the DB. Since we're on 64-bit servers for memory concerns later, we need to use Oracle's 64-bit drivers (Instant Client). Setup: x64 OS (XP or Windows 7) IIS (5 or 7, both x64 App Pools) Oracle 64-bit Instant Client (Separate Directory, in the PATH) Visual Studio 2008 SP1 In IIS the application pool runs as 64-bit, uses the Oracle drivers as intended, however since WebDev.WebServer.exe is 32-bit you'll get a BadImageFormatException because it's trying to load 64-bit driver DLLs in a 32-bit environment. All of our developers would like to be able to use the quick debug server via Visual Studio 2008, but since it runs as 32-bit we're unable to. Some problems we run into are during application startup, so although we're attaching to the IIS process sometimes that isn't enough to track an issue down. Are there any alternatives, or work-arounds? We would like to match our Dev/Val/Prod tiers as much as possible, so everything running in x64 would be ideal.

    Read the article

  • Help/Questions About New Team Foundation Server 2010 Installation

    - by user579218
    Hello. Before starting down the TFS2010 installation process, I have a few questions I'm hoping the community can help me with. We're planning on a single-server installation of TFS2010. Initially, we want version/source control and build services, but not reporting or SharePoint. We may add reporting and SharePoint capabilities later. Our environment will be Windows Server 2008 R2 (x64), SQL Server 2008 R2 (x64), Office 2010 (x86), Visual Studio 6 and 2010, and, of course, Team Foundation Server 2010. Can I install TFS2010 on a server that is on our domain? It's not a domain controller, it's just a member server on the domain. Should I install TFS2010 before or after putting the server on the domain? We have six developers that will be logging into their local development computers (which are also on the same domain) using their domain user accounts, do I add each domain user to the TFS2010 server's security groups? If so, which one(s)? Can I or should I use a domain user account as the TFS2010 service account? Or, should I just use Network Service? The TFS2010 install guide notes that none of the service accounts should belong to the Administrators security group, so which security group(s) are recommended for the service account(s)? We're planning on using a local instance of SQL Server 2008 R2 Standard with TFS2010, what service account should we use? Should we use the same domain account as TFS2010 or Local System or ?? The TFS2010 install guide isn't very specific on this. Since we're planning on this server being both the version/source control and build server, should we install our development environments (VS6, VS2010, Access2010) before installing TFS2010? Or does it matter? Thanks in advance for answering these questions.

    Read the article

  • EF4 Generate Database

    - by shaneseaton
    Hi, I am trying my hardest to find the simplest way to create a basic "model first" Entity Framework example. However, I am struggling with the actual generation of the database, particularly the running of the SQL against the database.
    Tools: Visual Studio 2010, SQL Server 2008 Express
    Process:
    1. Create a new class project
    2. Add a new Server-Database item (mdf) named "Database1.mdf" to the project
    3. Add an empty ADO.NET Entity Model
    4. Create a simple entity (Person: Id, Name)
    5. Generate the script, selecting the Database1 connection created for me by Visual Studio
    6. Right click the script editor and select the "Execute SQL..." option
    7. Log in to SQLEXPRESS
    This is where it falls over, saying it can't find a database named "Database1". The "problem" is that the SQL Server instance has not had Database1 attached to it. I am 100% positive that Visual Studio used to attach a database to SQL Express when it created a new database (step 2). This appears to not be the case any more (even the beta of VS2010 did it). Can someone confirm this, or tell me how to get this to happen? Is there a way that I can modify the T-SQL script to use an un-attached database, i.e. a file? I know I can use SQL Management Studio or sqlcmd to attach the database (as sketched below), but I would ideally like to avoid those solutions as I would like to see the cleanest method of just using Visual Studio. Ideal solutions (in order of most preferred): 1) Get Visual Studio to attach the newly created database; 2) Modify the generated SQL to point to a file. Thanks in advance.
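    For reference, the manual attach step mentioned above (the one the poster would prefer to avoid) can be done from Management Studio or sqlcmd with a statement roughly like the following sketch. The file path is an assumption based on the project layout described in the question and would need to be adjusted.
    -- Hedged sketch: attach an existing .mdf to the local SQL Server 2008 Express instance
    CREATE DATABASE Database1
    ON (FILENAME = 'C:\Projects\MyEfSample\Database1.mdf')   -- hypothetical path
    FOR ATTACH_REBUILD_LOG;   -- rebuilds the log file if no .ldf is present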

    Read the article

  • Linq to SQL Problem System.Data.Linq.IdentityManager.StandardIdentityManager.MultiKeyManager

    - by luckyluke
    I have a really tricky thing going on here. My project has around 100 tables and they are all mapped by LINQ. Everything works fine in the dev and test environments. These environments are MS Win 2008 R2 servers with SQL 2008 SP1 databases. IIS and SQL are on different machines. Now, on the production environment, which is an MS Win 2003 x64 web farm + geoclustered SQL 2008, it does NOT work. All I get is the exception:
    System.IndexOutOfRangeException: Index was outside the bounds of the array.
    at System.Data.Linq.IdentityManager.StandardIdentityManager.MultiKeyManager3.TryCreateKeyFromValues(Object[] values, MultiKey& k)
    at System.Data.Linq.IdentityManager.StandardIdentityManager.IdentityCache2.Find(Object[] keyValues)
    at System.Data.Linq.ChangeProcessor.GetOtherItem(MetaAssociation assoc, Object instance)
    at System.Data.Linq.ChangeProcessor.BuildEdgeMaps()
    at System.Data.Linq.ChangeProcessor.SubmitChanges(ConflictMode failureMode)
    at System.Data.Linq.DataContext.SubmitChanges(ConflictMode failureMode)
    at ERS.IIMP.Services.ExposuresSrv.Update(Int32 ExpID, Int32 AssID) Services\ExposuresSrv.cs
    My question is: what the hell? They have precisely the same DBML, the DB has exactly THE SAME structure (when I get the DB from prod to TEST and mount it, everything works just great), and the binaries on the web server are the same. I seriously do not know what to do... Has anyone found that LINQ works on one environment and does not on the second? I am really lost here. I really hope you can help me :)

    Read the article

  • Parse Text using scanner useDelimiter

    - by Brian
    Looking to parse the following text file: Sample text file: <2008-10-07text entered by user<2008-11-26additional text entered by user I would like to parse the above text so that I can have three variables: v1 = 2008-10-07 v2 = text entered by user v3 = Ted Parlor v1 = 2008-11-26 v2 = additional text entered by user v3 = Ted Parlor I attempted to use scanner and useDelimiter, however, I'm having issue on how to set this up to have the results as stated above. Here's my first attempt: enter code here import java.io.*; import java.util.Scanner; public class ScanNotes { public static void main(String[] args) throws IOException { Scanner s = null; try { //String regex = "(?<=\<)([^\*)(?=\)"; s = new Scanner(new BufferedReader(new FileReader("cur_notes.txt"))); s.useDelimiter("[<]+"); while (s.hasNext()) { String v1 = s.next(); String v2= s.next(); System.out.println("v1= " + v1 + " v2=" + v2); } } finally { if (s != null) { s.close(); } } } } The results is as follows: v1= 2008-10-07text entered by user v2=Ted Parlor What I desire is: v1= 2008-10-07 v2=text entered by user v3=Ted Parlor v1= 2008-11-26 v2=additional text entered by user v3=Ted Parlor Any help that would allow me to extract all three strings separately would be greatly appreciated.

    Read the article

  • #Error showing up in multiple LEFT JOIN statement Access query when value should be NULL

    - by lar
    I'm trying to return an ID's last 4 years of data, if existing. The table (call it A_TABLE) looks like this: ID, Year, Val The idea behind the query is this: for each ID/Year in the table, LEFT JOIN with Year-1, Year-2, and Year-3 (to get 4 years of data) and then return Val for each year. Here's the SQL: SELECT a.ID, a.year AS [Year], a.Val AS VAL, a1.year AS [Year-1], a1.Val AS [VAL-1], a2.year AS [Year-2], a2.Val AS [VAL-2], a3.year AS [Year-3], a3.Val AS [VAL-3] FROM ( ([A_TABLE] AS a LEFT JOIN [A_TABLE] AS a1 ON (a.ID = a1.ID) AND (a.year = a1.year+1)) LEFT JOIN [A_TABLE] AS a2 ON (a.ID = a2.ID) AND (a.year = a2.year+2)) LEFT JOIN [A_TABLE] AS a3 ON (a.ID = a3.ID) AND (a.year = a3.year+3) The problem is that, for past years where there is no data (eg, Year-1), I see "#Error" in the appropriate VAL column (eg, [VAL-1]). The weird thing is, I see the expected "null" in the Year column (eg, [YEAR-1]). Some sample data: ID YEAR VAL Dave 2004 1 Dave 2006 2 Dave 2007 3 Dave 2008 5 Dave 2009 0 outputs like this: ID YEAR VAL YEAR-1 VAL-1 YEAR-2 VAL-2 YEAR-3 VAL-3 Dave 2004 1 #Error #Error #Error Dave 2006 2 #Error 2004 1 #Error Dave 2007 3 2006 2 #Error 2004 1 Dave 2008 5 2007 3 2006 2 #Error Dave 2009 0 2008 5 2007 3 2006 2 Does that make sense? Why am I getting the appropriate NULL val for the non-existent YEARs, but an #Error for the non-existent VALs? (This is Access 2000. Conditional statements like "IIf(a1.val is null, -999, a1.val)" do not seem to do anything.) EDIT: It turns out that the errors are somehow caused by the fact that A_TABLE is actually a query. When I put all the data into an actual table and run the same query, everything shows up as it should. Thanks for the help, everyone.

    Read the article

  • WCF cross-domain policy security error

    - by George2
Hello everyone, I am using VSTS 2008 + C# + WCF + .NET 3.5 + Silverlight 3.0. I host a Silverlight control in an HTML page and debug it from VSTS 2008 (press F5, then run it in the VSTS 2008 built-in ASP.NET development web server), and then call a WCF service (hosted on another machine running IIS 7.0 + Vista). The WCF service is very simple; it just returns a constant string to the client. When invoking the WCF service from Silverlight, I get the following error message: An error occurred while trying to make a request to URI 'https://LabTest/Test.svc'. This could be due to attempting to access a service in a cross-domain way without a proper cross-domain policy in place, or a policy that is unsuitable for SOAP services. You may need to contact the owner of the service to publish a cross-domain policy file and to ensure it allows SOAP-related HTTP headers to be sent. This error may also be caused by using internal types in the web service proxy without using the InternalsVisibleToAttribute attribute. Please see the inner exception for more details. Here is the clientaccesspolicy.xml file; is anything wrong with it? <?xml version="1.0" encoding="utf-8" ?> <access-policy> <cross-domain-access> <policy> <allow-from http-request-headers="*"> <domain uri="*"> </domain> </allow-from> <grant-to> <resource path="/" include-subpaths="true"></resource> </grant-to> </policy> </cross-domain-access> </access-policy> Thanks in advance, George

    Read the article

  • navigate all items in a wpf tree view

    - by Brian Leahy
I want to be able to traverse the visual UI tree looking for an element with an ID bound to the visual element's Tag property. I'm wondering how I do this. Controls don't have children to traverse. I started using LogicalTreeHelper.GetChildren, which seems to work as intended, up until I hit a TreeView control... then LogicalTreeHelper.GetChildren doesn't return any children. Note: the purpose is to find the visual UI element that corresponds to the data item. That is, given the ID of an item, go find the UI element displaying it. Edit: I am apparently not explaining this well enough. I am binding some data objects to a TreeView control and then want to select a specific item programmatically given that business object's ID. I don't see why it's so hard to traverse the visual tree and find the element I want, as the data object's ID is in the Tag property of the appropriate visual element. I'm using Mole and I am able to find the UI element with the appropriate ID in its Tag. I just cannot find the visual element in code. LogicalTreeHelper does not traverse any items in the tree. Neither does ItemContainerGenerator.ContainerFromItem retrieve anything for items in the tree view.
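A minimal sketch of the kind of visual-tree walk being described, using VisualTreeHelper instead of LogicalTreeHelper; it assumes the ID is stored as a string in Tag, and that the TreeViewItem containers have already been generated (collapsed or virtualized nodes may not be in the visual tree yet).

using System.Windows;
using System.Windows.Media;

// Sketch: depth-first walk of the rendered visual tree looking for the element
// whose Tag holds a given ID. Assumes the ID is stored in Tag as a string.
public static class VisualTreeSearch
{
    public static FrameworkElement FindByTag(DependencyObject root, string id)
    {
        FrameworkElement fe = root as FrameworkElement;
        if (fe != null && Equals(fe.Tag, id))
            return fe;

        int count = VisualTreeHelper.GetChildrenCount(root);
        for (int i = 0; i < count; i++)
        {
            FrameworkElement match = FindByTag(VisualTreeHelper.GetChild(root, i), id);
            if (match != null)
                return match;
        }
        return null;
    }
}

Calling something like VisualTreeSearch.FindByTag(myTreeView, "42") from code-behind would return the first matching container; for items under collapsed parents, the usual approach is to expand the parent nodes (or disable virtualization) first so their containers exist.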

    Read the article

  • Subversion freaking out on me!

    - by Malfist
I have two copies of a site: one is the production copy, and the other is the development copy. I recently added everything in the production copy to a Subversion repository hosted on our Linux backup server. I created a tag of the current version and I was done. I then copied the development copy over the top of the production copy (on my local machine, where I have everything checked out). Only 10-20 files changed; however, when I use TortoiseSVN to do a commit, it says every file has changed. The generated diff shows Subversion removing everything and replacing it with the new version (which is exactly the same). What is going on? How do I fix it? An example diff: Index: C:/Users/jhollon/Documents/Visual Studio 2008/Projects/saloon/trunk/components/index.html =================================================================== --- C:/Users/jhollon/Documents/Visual Studio 2008/Projects/saloon/trunk/components/index.html (revision 5) +++ C:/Users/jhollon/Documents/Visual Studio 2008/Projects/saloon/trunk/components/index.html (working copy) @@ -1,4 +1,4 @@ -<html> -<body bgcolor="#FFFFFF"> -</body> +<html> +<body bgcolor="#FFFFFF"> +</body> </html> \ No newline at end of file

    Read the article

  • Windows Azure: Backup Services Release, Hyper-V Recovery Manager, VM Enhancements, Enhanced Enterprise Management Support

    - by ScottGu
    This morning we released a huge set of updates to Windows Azure.  These new capabilities include: Backup Services: General Availability of Windows Azure Backup Services Hyper-V Recovery Manager: Public preview of Windows Azure Hyper-V Recovery Manager Virtual Machines: Delete Attached Disks, Availability Set Warnings, SQL AlwaysOn Configuration Active Directory: Securely manage hundreds of SaaS applications Enterprise Management: Use Active Directory to Better Manage Windows Azure Windows Azure SDK 2.2: A massive update of our SDK + Visual Studio tooling support All of these improvements are now available to use immediately.  Below are more details about them. Backup Service: General Availability Release of Windows Azure Backup Today we are releasing Windows Azure Backup Service as a general availability service.  This release is now live in production, backed by an enterprise SLA, supported by Microsoft Support, and is ready to use for production scenarios. Windows Azure Backup is a cloud based backup solution for Windows Server which allows files and folders to be backed up and recovered from the cloud, and provides off-site protection against data loss. The service provides IT administrators and developers with the option to back up and protect critical data in an easily recoverable way from any location with no upfront hardware cost. Windows Azure Backup is built on the Windows Azure platform and uses Windows Azure blob storage for storing customer data. Windows Server uses the downloadable Windows Azure Backup Agent to transfer file and folder data securely and efficiently to the Windows Azure Backup Service. Along with providing cloud backup for Windows Server, Windows Azure Backup Service also provides capability to backup data from System Center Data Protection Manager and Windows Server Essentials, to the cloud. All data is encrypted onsite before it is sent to the cloud, and customers retain and manage the encryption key (meaning the data is stored entirely secured and can’t be decrypted by anyone but yourself). Getting Started To get started with the Windows Azure Backup Service, create a new Backup Vault within the Windows Azure Management Portal.  Click New->Data Services->Recovery Services->Backup Vault to do this: Once the backup vault is created you’ll be presented with a simple tutorial that will help guide you on how to register your Windows Servers with it: Once the servers you want to backup are registered, you can use the appropriate local management interface (such as the Microsoft Management Console snap-in, System Center Data Protection Manager Console, or Windows Server Essentials Dashboard) to configure the scheduled backups and to optionally initiate recoveries. You can follow these tutorials to learn more about how to do this: Tutorial: Schedule Backups Using the Windows Azure Backup Agent This tutorial helps you with setting up a backup schedule for your registered Windows Servers. Additionally, it also explains how to use Windows PowerShell cmdlets to set up a custom backup schedule. Tutorial: Recover Files and Folders Using the Windows Azure Backup Agent This tutorial helps you with recovering data from a backup. Additionally, it also explains how to use Windows PowerShell cmdlets to do the same tasks. Below are some of the key benefits the Windows Azure Backup Service provides: Simple configuration and management. 
Windows Azure Backup Service integrates with the familiar Windows Server Backup utility in Windows Server, the Data Protection Manager component in System Center and Windows Server Essentials, in order to provide a seamless backup and recovery experience to a local disk, or to the cloud. Block level incremental backups. The Windows Azure Backup Agent performs incremental backups by tracking file and block level changes and only transferring the changed blocks, hence reducing the storage and bandwidth utilization. Different point-in-time versions of the backups use storage efficiently by only storing the changed blocks between these versions. Data compression, encryption and throttling. The Windows Azure Backup Agent ensures that data is compressed and encrypted on the server before being sent to the Windows Azure Backup Service over the network. As a result, the Windows Azure Backup Service only stores encrypted data in the cloud storage. The encryption key is not available to the Windows Azure Backup Service, and as a result the data is never decrypted in the service. Also, users can set up throttling and configure how the Windows Azure Backup service utilizes the network bandwidth when backing up or restoring information. Data integrity is verified in the cloud. In addition to the secure backups, the backed up data is also automatically checked for integrity once the backup is done. As a result, any corruptions which may arise due to data transfer can be easily identified and are fixed automatically. Configurable retention policies for storing data in the cloud. The Windows Azure Backup Service accepts and implements retention policies to recycle backups that exceed the desired retention range, thereby meeting business policies and managing backup costs. Hyper-V Recovery Manager: Now Available in Public Preview I’m excited to also announce the public preview of a new Windows Azure Service – the Windows Azure Hyper-V Recovery Manager (HRM). Windows Azure Hyper-V Recovery Manager helps protect your business critical services by coordinating the replication and recovery of System Center Virtual Machine Manager 2012 SP1 and System Center Virtual Machine Manager 2012 R2 private clouds at a secondary location. With automated protection, asynchronous ongoing replication, and orderly recovery, the Hyper-V Recovery Manager service can help you implement Disaster Recovery and restore important services accurately, consistently, and with minimal downtime. Application data in Hyper-V Recovery Manager scenarios always travels on your on-premises replication channel. Only metadata (such as names of logical clouds, virtual machines, networks etc.) that is needed for orchestration is sent to Azure. All traffic sent to/from Azure is encrypted. You can begin using Windows Azure Hyper-V Recovery today by clicking New->Data Services->Recovery Services->Hyper-V Recovery Manager within the Windows Azure Management Portal.  You can read more about Windows Azure Hyper-V Recovery Manager in Brad Anderson’s 9-part series, Transform the datacenter. To learn more about setting up Hyper-V Recovery Manager follow our detailed step-by-step guide. Virtual Machines: Delete Attached Disks, Availability Set Warnings, SQL AlwaysOn Today’s Windows Azure release includes a number of nice updates to Windows Azure Virtual Machines.  
These improvements include: Ability to Delete both VM Instances + Attached Disks in One Operation Prior to today’s release, when you deleted VMs within Windows Azure we would delete the VM instance – but not delete the drives attached to the VM.  You had to manually delete these yourself from the storage account.  With today’s update we’ve added a convenience option that now allows you to either retain or delete the attached disks when you delete the VM:   We’ve also added the ability to delete a cloud service, its deployments, and its role instances with a single action. This can either be a cloud service that has production and staging deployments with web and worker roles, or a cloud service that contains virtual machines.  To do this, simply select the Cloud Service within the Windows Azure Management Portal and click the “Delete” button: Warnings on Availability Sets with Only One Virtual Machine In Them One of the nice features that Windows Azure Virtual Machines supports is the concept of “Availability Sets”.  An “availability set” allows you to define a tier/role (e.g. webfrontends, databaseservers, etc) that you can map Virtual Machines into – and when you do this Windows Azure separates them across fault domains and ensures that at least one of them is always available during servicing operations.  This enables you to deploy applications in a high availability way. One issue we’ve seen some customers run into is where they define an availability set, but then forget to map more than one VM into it (which defeats the purpose of having an availability set).  With today’s release we now display a warning in the Windows Azure Management Portal if you have only one virtual machine deployed in an availability set to help highlight this: You can learn more about configuring the availability of your virtual machines here. Configuring SQL Server Always On SQL Server Always On is a great feature that you can use with Windows Azure to enable high availability and DR scenarios with SQL Server. Today’s Windows Azure release makes it even easier to configure SQL Server Always On by enabling “Direct Server Return” endpoints to be configured and managed within the Windows Azure Management Portal.  Previously, setting this up required using PowerShell to complete the endpoint configuration.  Starting today you can enable this simply by checking the “Direct Server Return” checkbox: You can learn more about how to use direct server return for SQL Server AlwaysOn availability groups here. Active Directory: Application Access Enhancements This summer we released our initial preview of our Application Access Enhancements for Windows Azure Active Directory.  This service enables you to securely implement single-sign-on (SSO) support against SaaS applications (including Office 365, SalesForce, Workday, Box, Google Apps, GitHub, etc) as well as LOB based applications (including ones built with the new Windows Azure AD support we shipped last week with ASP.NET and VS 2013). Since the initial preview we’ve enhanced our SAML federation capabilities, integrated our new password vaulting system, and shipped multi-factor authentication support. We've also turned on our outbound identity provisioning system and have it working with hundreds of additional SaaS Applications: Earlier this month we published an update on dates and pricing for when the service will be released in general availability form.  
In this blog post we announced our intention to release the service in general availability form by the end of the year.  We also announced that the below features would be available in a free tier with it: SSO to every SaaS app we integrate with – Users can Single Sign On to any app we are integrated with at no charge. This includes all the top SAAS Apps and every app in our application gallery whether they use federation or password vaulting. Application access assignment and removal – IT Admins can assign access privileges to web applications to the users in their active directory assuring that every employee has access to the SAAS Apps they need. And when a user leaves the company or changes jobs, the admin can just as easily remove their access privileges assuring data security and minimizing IP loss. User provisioning (and de-provisioning) – IT admins will be able to automatically provision users in 3rd party SaaS applications like Box, Salesforce.com, GoToMeeting, DropBox and others. We are working with key partners in the ecosystem to establish these connections, meaning you no longer have to continually update user records in multiple systems. Security and auditing reports – Security is a key priority for us. With the free version of these enhancements you'll get access to our standard set of access reports giving you visibility into which users are using which applications, when they were using them and where they are using them from. In addition, we'll alert you to unusual usage patterns, for instance when a user logs in from multiple locations at the same time. Our Application Access Panel – Users are logging in from every type of device including Windows, iOS, & Android. Not all of these devices handle authentication in the same manner but the user doesn't care. They need to access their apps from the devices they love. Our Application Access Panel will support the ability for users to access and launch their apps from any device and anywhere. You can learn more about our plans for application management with Windows Azure Active Directory here.  Try out the preview and start using it today. Enterprise Management: Use Active Directory to Better Manage Windows Azure Windows Azure Active Directory provides the ability to manage your organization in a directory which is hosted entirely in the cloud, or alternatively kept in sync with an on-premises Windows Server Active Directory solution (allowing you to seamlessly integrate with the directory you already have).  With today’s Windows Azure release we are integrating Windows Azure Active Directory even more within the core Windows Azure management experience, and enabling an even richer enterprise security offering.  Specifically: 1) All Windows Azure accounts now have a default Windows Azure Active Directory created for them.  You can create and map any users you want into this directory, and grant administrative rights to manage resources in Windows Azure to these users. 2) You can keep this directory entirely hosted in the cloud – or optionally sync it with your on-premises Windows Server Active Directory.  Both options are free.  The latter approach is ideal for companies that wish to use their corporate user identities to sign in and manage Windows Azure resources.  It also ensures that if an employee leaves an organization, his or her access control rights to the company’s Windows Azure resources are immediately revoked. 
3) The Windows Azure Service Management APIs have been updated to support using Windows Azure Active Directory credentials to sign in and perform management operations.  Prior to today’s release customers had to download and use management certificates (which were not scoped to individual users) to perform management operations.  We still support this management certificate approach (don’t worry – nothing will stop working).  But we think the new Windows Azure Active Directory authentication support enables an even easier and more secure way for customers to manage resources going forward.  4) The Windows Azure SDK 2.2 release (which is also shipping today) includes built-in support for the new Service Management APIs that authenticate with Windows Azure Active Directory, and now allow you to create and manage Windows Azure applications and resources directly within Visual Studio using your Active Directory credentials.  This, combined with updated PowerShell scripts that also support Active Directory, enables an end-to-end enterprise authentication story with Windows Azure. Below are some details on how all of this works: Subscriptions within a Directory As part of today’s update, we have associated all existing Windows Azure accounts with a Windows Azure Active Directory (and created one for you if you don’t already have one). When you log in to the Windows Azure Management Portal, you’ll now see the directory name in the URI of the browser.  For example, in the screen-shot below you can see that I have a “scottgu” directory that my subscriptions are hosted within: Note that you can continue to use Microsoft Accounts (formerly known as Microsoft Live IDs) to sign into Windows Azure.  These map just fine to a Windows Azure Active Directory – so there is no need to create new usernames that are specific to a directory if you don’t want to.  In the scenario above I’m actually logged in using my @hotmail.com based Microsoft ID which is now mapped to a “scottgu” active directory that was created for me.  By default everything will continue to work just like you used to before. Manage your Directory You can manage an Active Directory (including the one we now create for you by default) by clicking the “Active Directory” tab in the left-hand side of the portal.  This will list all of the directories in your account.  Clicking one the first time will display a getting started page that provides documentation and links to perform common tasks with it: You can use the built-in directory management support within the Windows Azure Management Portal to add/remove/manage users within the directory, enable multi-factor authentication, associate a custom domain (e.g. mycompanyname.com) with the directory, and/or rename the directory to whatever friendly name you want (just click the configure tab to do this).  You can also set up the directory to automatically sync with an on-premises Active Directory using the “Directory Integration” tab. Note that users within a directory by default do not have admin rights to log in or manage Windows Azure based resources.  You still need to explicitly grant them co-admin permissions on a subscription for them to log in or manage resources in Windows Azure.  You can do this by clicking the Settings tab on the left-hand side of the portal and then by clicking the administrators tab within it. 
Sign-In Integration within Visual Studio If you install the new Windows Azure SDK 2.2 release, you can now connect to Windows Azure from directly inside Visual Studio without having to download any management certificates.  You can now just right-click on the “Windows Azure” icon within the Server Explorer and choose the “Connect to Windows Azure” context menu option to do so: Doing this will prompt you to enter the email address of the username you wish to sign in with (make sure this account is a user in your directory with co-admin rights on a subscription): You can use either a Microsoft Account (e.g. Windows Live ID) or an Active Directory based Organizational account as the email.  The dialog will update with an appropriate login prompt depending on which type of email address you enter: Once you sign in, you’ll see the Windows Azure resources that you have permissions to manage show up automatically within the Visual Studio server explorer and be available to start using: No downloading of management certificates required.  All of the authentication was handled using your Windows Azure Active Directory! Manage Subscriptions across Multiple Directories If you already have multiple directories and multiple subscriptions within your Windows Azure account, we have done our best to create a good default mapping of your subscriptions->directories as part of today’s update.  If you don’t like the default subscription-to-directory mapping we have done, you can click the Settings tab in the left-hand navigation of the Windows Azure Management Portal and browse to the Subscriptions tab within it: If you want to map a subscription under a different directory in your account, simply select the subscription from the list, and then click the “Edit Directory” button to choose which directory to map it to.  Mapping a subscription to a different directory takes only seconds and will not cause any of the resources within the subscription to recycle or stop working.  We’ve made the directory->subscription mapping process self-service so that you always have complete control and can map things however you want. Filtering By Directory and Subscription Within the Windows Azure Management Portal you can filter resources in the portal by subscription (allowing you to show/hide different subscriptions).  If you have subscriptions mapped to multiple directory tenants, we also now have a filter drop-down that allows you to filter the subscription list by directory tenant.  This filter is only available if you have multiple subscriptions mapped to multiple directories within your Windows Azure Account:   Windows Azure SDK 2.2 Today we are also releasing a major update of our Windows Azure SDK.  The Windows Azure SDK 2.2 release adds some great new features including: Visual Studio 2013 Support Integrated Windows Azure Sign-In support within Visual Studio Remote Debugging Cloud Services with Visual Studio Firewall Management support within Visual Studio for SQL Databases Visual Studio 2013 RTM VM Images for MSDN Subscribers Windows Azure Management Libraries for .NET Updated Windows Azure PowerShell Cmdlets and ScriptCenter I’ll post a follow-up blog shortly with more details about all of the above. 
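As a rough illustration of the Active Directory based management story described above, the sketch below acquires a token with ADAL, wraps it in TokenCloudCredentials, and lists cloud services through the Windows Azure Management Libraries for .NET that ship alongside SDK 2.2. This is an assumption-laden sketch, not sample code from the release: the tenant, client ID, redirect URI, and subscription ID are placeholders, and exact class or method names may differ slightly in the shipped packages.

using System;
using Microsoft.IdentityModel.Clients.ActiveDirectory;   // ADAL (assumed available)
using Microsoft.WindowsAzure;                             // TokenCloudCredentials
using Microsoft.WindowsAzure.Management.Compute;          // ComputeManagementClient

// Sketch only: sign in with Windows Azure Active Directory and call the
// Service Management APIs via the management libraries. All IDs/URIs below are placeholders.
class ManagementApiSketch
{
    static void Main()
    {
        var authContext = new AuthenticationContext("https://login.windows.net/contoso.onmicrosoft.com");
        AuthenticationResult token = authContext.AcquireToken(
            "https://management.core.windows.net/",      // Service Management resource
            "11111111-1111-1111-1111-111111111111",      // placeholder client (application) id
            new Uri("http://localhost/manage"));         // placeholder redirect URI

        var credentials = new TokenCloudCredentials("your-subscription-id", token.AccessToken);

        using (var compute = new ComputeManagementClient(credentials))
        {
            foreach (var service in compute.HostedServices.List())
            {
                Console.WriteLine(service.ServiceName);
            }
        }
    }
}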
Additional Updates In addition to the above enhancements, today’s release also includes a number of additional improvements: AutoScale: Richer time and date based scheduling support (set different rules on different dates) AutoScale: Ability to Scale to Zero Virtual Machines (very useful for Dev/Test scenarios) AutoScale: Support for time-based scheduling of Mobile Service AutoScale rules Operation Logs: Auditing support for Service Bus management operations Today we also shipped a major update to the Windows Azure SDK – Windows Azure SDK 2.2.  It has so much goodness in it that I have a whole second blog post coming shortly on it! :-) Summary Today’s Windows Azure release enables a bunch of great new scenarios, and enables a much richer enterprise authentication offering. If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

< Previous Page | 571 572 573 574 575 576 577 578 579 580 581 582  | Next Page >