Search Results

Search found 23783 results on 952 pages for 'edit and continue'.

  • Unable to extend desktop

    - by CSharperWithJava
    I'm trying to hook up my TV to my computer as a gaming/multimedia center, but I'm having trouble setting it up. I have a custom-built machine running Windows 7 RC. It has an ATI Radeon 4800 video card with two DVI outputs and one S-Video output. I have an S-Video-to-composite adapter that connects to my TV. (It's an old TV with only cable, composite, and S-Video connections.) I can switch the desktop to my TV without a problem, but I can't duplicate or extend my desktop onto it. I've installed the latest drivers and Catalyst Control Center, but it won't let it work any more readily than Windows would. Any suggestions? Would using an S-Video cable instead of the adapter change anything? (The only reason I use the adapter is because it came with the graphics card.) (Edit) I installed the latest drivers and I can now duplicate the screen (show the same image on the monitor and the TV), but I still can't extend the desktop.

  • How views are changing in future versions of SQL

    - by Rob Farley
    April is here, and this weekend, SQL v11.0 (previously known as Denali, now known as SQL Server 2012) reaches general availability. And so I thought I'd share some news about what's coming next. I didn't hear this at the MVP Summit earlier this year (where there was lots of NDA information given, but I didn't go), so I think I'm free to share it.

    I've written before about CTEs being query-scoped views. Well, the actual story goes a bit further, and will continue to develop in future versions. A CTE is like a "temporary temporary view", scoped to a single query. Since globally-scoped temporary objects use a two-hash naming style (##name) and session-scoped (or 'local') temporary objects use a one-hash style (#name), this query-scoped temporary object uses a cunning zero-hash naming style. We see this implied in Books Online on the CREATE TABLE page, but as we know, temporary views are not yet supported in SQL Server.

    However, in a breakaway from ANSI SQL, Microsoft is moving towards consistency with its naming. We know that a CTE is a "common table expression" - this is proving to be more strategic than you may have appreciated. Within the Microsoft product group, the term "Table Expression" is far more widely used than just for CTEs. Anything that can be used in a FROM clause is referred to as a Table Expression, so long as it doesn't actually store data (which would make it a Table, rather than a Table Expression). You can see this is not just restricted to the product group by doing an internet search for how the term is used without 'common'.

    In the past, Books Online has referred to a view as a "virtual table" (but notice that there is no SQL 2012 version of this page). However, it was generally decided that "virtual table" was a poor name because it wasn't completely accurate, and it's typically accepted that mixing virtualisation and SQL is frowned upon. That page I linked to says "or stored query", which is slightly better, but when the SQL 2012 version of that page is actually published, the line will be changed to read: "A view is a stored table expression (STE)".

    This change will be the first of many. During the SQL 2012 R2 release, the keyword VIEW will become deprecated (this will be SQL v11 SP1.5). Three versions later, in SQL 14.5, you will need to be in compatibility mode 140 to allow "CREATE VIEW" to work. Also consistent with Microsoft's deprecation policy, the execution of any query that refers to an object created as a view (rather than with the new "CREATE STE") will cause a Deprecation Event to fire. This will all be in preparation for the introduction of Single-Column Table Expressions (to be introduced in SQL 17.3 SP6), which will finally shut up those people waiting for a decent implementation of Inline Scalar Functions.

    And of course, CTEs are "Common" because the Table Expression definition needs to be repeated over and over throughout a stored procedure. ...or so I think I heard at some point. Oh, and congratulations to all the new MVPs on this April 1st. @rob_farley

  • Rolling Along: PASS Board Year 2, Q2

    - by Denise McInerney
    Eighteen months into my time as a PASS Director, I'm especially proud of what the Virtual Chapters have accomplished and want to share that progress with you. I'm also pleased that the organization has invested more resources to support the VCs. In this quarter I got to attend two conferences and meet more members of the SQL community.

    Virtual Chapters: In the first six months of 2013, VCs have hosted more than 50 webinars, offering free technical education to over 6,200 attendees. This is a great benefit to PASS members; thanks to the VC leaders, volunteers, and speakers who contribute their time to produce these events. The Performance VC held their "Summer Performance Palooza", an event featuring eight back-to-back sessions. Links to the session recordings can be found on the VC's web site. The new webinar platform, GoToWebinar, has been rolled out to all the VCs. This is a more stable, scalable platform and represents an important investment in the future of the VCs. A few new VCs are in the planning stages, including one focused on security and one for Russian speakers. Visit the Virtual Chapter home page to sign up for the chapters that interest you. Each Virtual Chapter is offering a discount code for PASS Summit 2013; be sure to ask your VC leader for the code to save $200 on Summit registration.

    24 Hours of PASS: The next 24HOP will be on July 31. This Summit Preview edition will feature 24 consecutive webcasts presented by experts who will be speaking at Summit in October. Registration for this free event is open now, and we will be using the GoToWebinar platform for 24HOP as well.

    Business Analytics Conference: April marked the first PASS Business Analytics Conference in Chicago. This introduced PASS to another segment of data professionals: the analysts and data scientists who work with the world's growing collection of data. Overall the inaugural event was a success and gave us a glimpse into this increasingly important space. After Chicago the Board had several serious discussions about the lessons learned from this event and what we should do next. We agreed to apply those lessons and continue to invest in this event; there will be a PASS Business Analytics Conference in 2014. I'm very pleased the next event will be in San Jose, CA, the heart of Silicon Valley, a place where a great deal of investment and innovation in data analytics is taking place.

    Global SQL Community: Over the last couple of years PASS has been taking steps to become more relevant to SQL communities in different parts of the world. In May I had the opportunity to attend SQL Bits XI in Nottingham, England. It was enlightening to meet and talk with SQL professionals from around the U.K. as well as many other European countries. The many SQL Bits volunteers put on a great event and were gracious hosts.

    Budgets: The Board passed the FY14 budget at the end of June. The budget process can be challenging and requires the Board to make some difficult choices about where to allocate resources. Overall I'm satisfied with the decisions we made and think we are investing in the right activities and programs.

    Next Up: The Board is meeting July 18-19 in Kansas City. We will be holding the Executive Committee election for the Exec Co that will take office in 2014. We will also be discussing plans for the next BA conference as well as the next steps for our Global Growth initiative. Applications for the upcoming Board of Directors election open on July 24. If you are considering running for the Board, you can visit the PASS elections site to learn more about the election process, and I encourage anyone considering running to reach out to current and past Board members to learn what the role entails. Plans for the next PASS Summit are in full swing; we are working on some fun new ideas to introduce attendees to the many ways to become involved in the SQL community.

  • How to block own rpcap traffic where tshark is running?

    - by Pankaj Goyal
    Platform: Fedora 13, 32-bit.

    RemoteMachine$ ./rpcapd -n
    ClientMachine$ tshark -w "filename" -i "any interface name"

    As soon as the capture starts without any capture filter, thousands of packets get captured. rpcapd binds to port 2002 by default, and while establishing the connection it sends a randomly chosen port number to the client for further communication. Both client and server machines then exchange TCP packets through randomly chosen ports, so I cannot even specify a capture filter to block this rpcap-related TCP traffic. Wireshark and tshark for Windows have a "Do not capture own Rpcap traffic" option under Remote Settings in the Edit Interface dialog box, but there is no such option in tshark for Linux. It would also help if anyone could tell me how Wireshark blocks rpcap traffic.
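
    A partial workaround, assuming rpcapd is still bound to its default port, is to exclude at least the rpcap control channel with a capture filter; the randomly negotiated data ports still get through, so this is only a sketch of the idea, not a full fix:

    # drops the rpcapd control connection only (port 2002 is its default);
    # the dynamically chosen data ports cannot be matched by a static filter
    tshark -w "filename" -i "any interface name" -f "not tcp port 2002"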

  • How to write in a <array><dict> structure with defaults write?

    - by Hedge
    I've got a .plist file with a structure like this:

    <plist version="1.0">
      <array>
        <dict>
          <key>BundleIsVersionChecked</key>
          <false/>
          <key>BundleIsRelocatable</key>
          <false/>
          <key>BundleHasStrictIdentifier</key>
          <false/>
          <key>RootRelativeBundlePath</key>
          <string>value</string>
        </dict>
      </array>
    </plist>

    I want to add or edit the RootRelativeBundlePath key with the defaults write command. Another possibility would be writing the whole plist file, but it has to have the exact same structure. How can I do this?
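
    For nested array/dict structures like this, defaults write is awkward; PlistBuddy (shipped with OS X at /usr/libexec/PlistBuddy) addresses entries by path, where :0 is the first element of the top-level array. A sketch, assuming the file lives at /path/to/file.plist and "newvalue" is the hypothetical value to store:

    # change the existing key inside the first <dict> of the <array>
    /usr/libexec/PlistBuddy -c "Set :0:RootRelativeBundlePath newvalue" /path/to/file.plist
    # or create it if it does not exist yet
    /usr/libexec/PlistBuddy -c "Add :0:RootRelativeBundlePath string newvalue" /path/to/file.plist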

  • Does an unmanaged 4/8-port GBit Ethernet switch with a GBIC port exist?

    - by Aaron Digulla
    I'm looking for a simple unmanaged switch with 4-8 GBit Ethernet ports and a fiber port (either as a GBIC slot or pre-installed with a 1000BASE-SX port). Does something like that exist? [EDIT] I want to connect two places in my home without drilling large holes through the floors. Therefore, I'm looking for a cheap way to connect two GBit switches via fiber. I tried with a media converter (GBit <-> multimode fiber), but that costs about 50% of the throughput. So I was hoping that there is a cheap, small GBit switch which has a GBIC slot. All I found so far are very expensive managed switches with 12 or 24 ports for industry use.

  • Cropping a PDF File's Margin During Printing

    - by JavaMan
    I'm using the free Acrobat Reader to print out some PDF documents that have very large top/bottom/left/right margins. I want to remove the margins, which waste too much space and make the fonts too small. I used to use Acrobat (the paid version with edit features) to crop the source PDF manually, but since mine is an old version it does not support newer PDF formats, and I don't want to upgrade for such a simple use. Is there any free way to crop unwanted white margins from a PDF before printing? I am thinking of printing the PDF files to a PDF printer like the Bullzip PDF Printer and enlarging the output manually so as to remove the white margins, but there does not seem to be such a feature in Bullzip PDF Printer. Is there any other virtual printer software that can be used for this purpose?
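
    One free command-line route, assuming a TeX Live installation is available, is its pdfcrop utility, which trims every page to its content's bounding box so the printed fonts come out larger:

    # remove the white margins, keeping a 5pt border
    # (a sketch; input.pdf and cropped.pdf are placeholder names)
    pdfcrop --margins 5 input.pdf cropped.pdf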

  • Oracle Fusion Supply Chain Management (SCM) Designs May Improve End User Productivity

    - by Applications User Experience
    By Michele Molnar, Senior Usability Engineer, Applications User Experience (March 10, 2011)

    The Challenge: The SCM User Experience team, in close collaboration with product management and strategy, completely redesigned the user experience for Oracle Fusion applications. One of the goals of this redesign was to increase end user productivity by applying design patterns and guidelines and incorporating findings from extensive usability research. But a question remained: how do we know that the Oracle Fusion designs will actually increase end user productivity?

    The Test: To answer this question, the SCM usability engineers compared Oracle Fusion designs to their corresponding existing Oracle applications using the workflow time analysis method. This method breaks tasks into a sequence of operators; by applying standard time estimates for all of the operators in a task, an estimate of the overall task time can be calculated. The workflow time analysis method has recently been adopted by the Applications User Experience group for use in predicting end user productivity. Using this method, a design can be tested and refined as needed to improve productivity even before the design is coded. For the study, we selected some of our recent designs for Oracle Fusion Product Information Management (PIM). The designs encompassed tasks performed by Product Managers to create, manage, and define products for their organization. (See Figure 1 for an example.) In applying this method, the SCM usability engineers collaborated with Product Management to compare the new Oracle Fusion Applications designs against Oracle's existing applications. Together, we performed the following activities:
    - Identified the five most frequently performed tasks
    - Created detailed task scenarios that provided the context for each task
    - Conducted task walkthroughs
    - Analyzed and documented the steps and flow required to complete each task
    - Applied standard time estimates to the operators in each task to estimate the overall task completion time

    Figure 1. The interactions on each Oracle Fusion Product Information Management screen were documented, as indicated by the red highlighting. The task scenario and script provided the context for each task.

    The Results: The workflow time analysis method predicted that the Oracle Fusion Applications designs would result in productivity gains in each task, ranging from 8% to 62%, with an overall productivity gain of 43%. All other factors being equal, the new designs should enable these tasks to be completed in about half the time they take with existing Oracle applications. Further analysis revealed that these performance gains would be achieved by reducing the number of clicks and screens needed to complete the tasks.

    Conclusions: Using the workflow time analysis method, we can expect the Oracle Fusion Applications redesign to succeed in improving end user productivity. The method appears to be an effective and efficient tool for testing, refining, and retesting designs to optimize productivity. It does not replace usability testing with end users, but it can be used as an early predictor of design productivity even before designs are coded. We are planning to conduct usability tests later in the development cycle to compare actual end user data with the workflow time analysis results; such results can potentially be used to validate the productivity improvement predictions. Used together, the workflow time analysis method and usability testing will enable us to continue creating, evaluating, and delivering Oracle Fusion designs that exceed the expectations of our end users, both in the quality of the user experience and in productivity. (For more information about studying productivity, refer to the Measuring User Productivity blog.)
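
    As a rough worked illustration of the arithmetic behind the method (using the classic keystroke-level-model operator estimates of roughly 1.35 s per mental preparation, 1.1 s per pointing action, and 0.2 s per keystroke; these are illustrative textbook values, not Oracle's internal ones): a task requiring 3 mental preparations, 4 pointing actions, and 6 keystrokes would be estimated at 3(1.35) + 4(1.1) + 6(0.2) = 9.65 seconds, and a redesign that eliminates one pointing action and two keystrokes would bring the estimate down to 8.15 seconds, a predicted gain of about 16%.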

  • nvidia graphics resolution problem

    - by Deepak Adhikari
    I am currently using Ubuntu 12.04 on an Acer Aspire TimelineX 3830TG with a 2GB NVIDIA GeForce GT 540M graphics card. To enable my graphics card I followed these steps:
    1.) I activated nvidia_current and nvidia_current_updates from Additional Drivers
    2.) sudo nvidia-xconfig
    3.) then rebooted
    Following these steps I got these errors:
    1.) my resolution is 640x480 (there is no 1366x768 option in Displays; before the nvidia-xconfig command was entered, 1366x768 was available)
    2.) when I open nvidia-settings it shows me the following error: "You do not appear to be using the NVIDIA X driver. Please edit your X configuration file (just run 'nvidia-xconfig' as root) and restart the X server."
    Problems that need to be solved:
    1.) change the resolution to 1366x768
    2.) also, how do I check whether my NVIDIA graphics is working or not?
    Please, someone help me solve these issues; I am seriously in need of my graphics card. I want my NVIDIA graphics card to work as smoothly as my Intel graphics. I am not willing to use Bumblebee. With regards, an Ubuntu user
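
    As a stopgap while the driver problem is sorted out, the missing mode can often be added by hand with cvt and xrandr. A sketch: the modeline numbers below are what cvt printed on one machine and may differ on yours, and VGA-0 is a placeholder for whatever output name xrandr actually lists:

    cvt 1366 768   # prints a Modeline (cvt typically rounds to 1368x768)
    xrandr --newmode "1368x768_60.00" 85.25 1368 1440 1576 1784 768 771 781 798 -hsync +vsync
    xrandr --addmode VGA-0 "1368x768_60.00"   # replace VGA-0 with your output's name
    xrandr --output VGA-0 --mode "1368x768_60.00"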

  • How to Archive, Search, and View Your Tweet Statistics with ThinkUp

    - by YatriTrivedi
    Worried about archiving your tweets? Want a more powerful search? Want to see your tweet statistics? You can do all of that and more by installing ThinkUp on your home server. ThinkUp is a brilliant application (currently in beta) that will archive all of your tweets, your replies, responses, etc., so that you can search through them and find some helpful usage statistics. It has quite a few plugins, including one that adds full Facebook support, too. It's designed to be installed on a LAMP server; that is, Linux, Apache, MySQL, and PHP provide the backbone for it. While it's possible to install it on a Windows- or Mac-based machine, it's most easily handled in Linux, so we'll be using Ubuntu to show you how to get it up and running. It's in very active development by the founder, Gina Trapani, and by many users in the community.
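
    For reference, on an Ubuntu system of that era the LAMP prerequisites could be pulled in roughly like this (a sketch; ThinkUp also wants PHP's cURL and GD extensions, and exact package names vary by release):

    sudo apt-get install apache2 mysql-server php5 libapache2-mod-php5 php5-mysql php5-curl php5-gd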

  • Googlebot visit but no cache update - why?

    - by Mick
    I have made a new plain-vanilla HTML website and have been making regular modifications to it on an almost daily basis. The site is hosted by Hostmonster, and as part of their service they offer "awstats" to let you know assorted details of visitors to the site. One thing is puzzling me. According to awstats, a robot/spider calling itself "Googlebot" visited my site as recently as today (28th June 2011), but when I find my site on Google (e.g. by searching for "full reserve banking") the cache is dated only the 5th of June. I always thought that a visit from the Google robot was synonymous with a cache update. Am I wrong? Or have I accidentally put something in the site telling Google that nothing has been updated? EDIT: It seems a moderator has removed the name of my website, so there is now no chance that anyone could check whether I had made some error on my site :-( ... but anyway, in answer to paulmorriss' question, here is what awstats was telling me:

  • Is there any research out there on geographic differences in work environments (e.g., respect) for programmers?

    - by Ethel Evans
    One thing I've learned from this website is that software developers are not treated everywhere the way I've seen them treated in the companies I've worked at, and some of the differences seem to be related to the culture or other factors of the geographic location where the programmer works. In some areas it seems programmers can expect many perks and a great deal of professional respect, but in others programmers are seen as laborers who are told what to do and should go do it without question. Even just within the USA, there seem to be major differences in "the norm" between the various regions of the country. I'm wondering how much of this is just my perception, and how much reflects real differences in how programmers are perceived in their different locations. Is there any research discussing major differences in programmer work environments, or in attitudes about how to treat or respect programmers, by geography? I'd be interested in multiple articles tackling different ways of looking at this. Edit: Research, specifically, doesn't seem to be available, so I'm making the question broader: is there any good, thoughtful writing on the topic, of any kind?

  • How to remove synaptic without installing all the unwanted packages?

    - by Jay
    I am trying to uninstall synaptic. I prefer using apt-get and other command-line tools to manage my packages, so I do not need synaptic or the software manager, and I'm trying to remove both of them using apt-get. It's a new box; I recently installed Linux Mint MATE 15. After installation, the only things I did were

    sudo apt-get update
    sudo apt-get dist-upgrade

    After that, I ran this command to remove synaptic:

    sudo apt-get remove --purge synaptic

    But this gives me very weird output:

    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    The following packages were automatically installed and are no longer required:
      apturl-kde icoutils kate-data katepart kde-runtime kde-runtime-data kdelibs-bin kdelibs5-data kdelibs5-plugins kdesudo kdoctools kubuntu-debug-installer libattica0.4 libdlrestrictions1 libkactivities-bin libkactivities-models1 libkactivities6 libkatepartinterfaces4 libkcmutils4 libkde3support4 libkdeclarative5 libkdecore5 libkdesu5 libkdeui5 libkdewebkit5 libkdnssd4 libkemoticons4 libkfile4 libkhtml5 libkidletime4 libkio5 libkjsapi4 libkjsembed4 libkmediaplayer4 libknewstuff3-4 libknotifyconfig4 libkntlm4 libkparts4 libkpty4 libkrosscore4 libktexteditor4 libkxmlrpcclient4 libnepomuk4 libnepomukcore4abi1 libnepomukquery4a libnepomukutils4 libntrack-qt4-1 libntrack0 libphonon4 libplasma3 libpolkit-qt-1-1 libpoppler-qt4-4 libqapt2 libqapt2-runtime libqca2 libqt4-qt3support libsolid4 libsoprano4 libstreamanalyzer0 libstreams0 libthreadweaver4 libvirtodbc0 nepomuk-core nepomuk-core-data ntrack-module-libnl-0 odbcinst odbcinst1debian2 oxygen-icon-theme phonon phonon-backend-gstreamer plasma-scriptengine-javascript qapt-batch shared-desktop-ontologies soprano-daemon virtuoso-minimal virtuoso-opensource-6.1-bin virtuoso-opensource-6.1-common
    Use 'apt-get autoremove' to remove them.
    The following extra packages will be installed:
      apturl-kde icoutils kate-data katepart kde-runtime kde-runtime-data kdelibs-bin kdelibs5-data kdelibs5-plugins kdesudo kdoctools kubuntu-debug-installer libattica0.4 libdlrestrictions1 libkactivities-bin libkactivities-models1 libkactivities6 libkatepartinterfaces4 libkcmutils4 libkde3support4 libkdeclarative5 libkdecore5 libkdesu5 libkdeui5 libkdewebkit5 libkdnssd4 libkemoticons4 libkfile4 libkhtml5 libkidletime4 libkio5 libkjsapi4 libkjsembed4 libkmediaplayer4 libknewstuff3-4 libknotifyconfig4 libkntlm4 libkparts4 libkpty4 libkrosscore4 libktexteditor4 libkxmlrpcclient4 libnepomuk4 libnepomukcore4abi1 libnepomukquery4a libnepomukutils4 libntrack-qt4-1 libntrack0 libphonon4 libplasma3 libpolkit-qt-1-1 libpoppler-qt4-4 libqapt2 libqapt2-runtime libqca2 libqt4-qt3support libsolid4 libsoprano4 libstreamanalyzer0 libstreams0 libthreadweaver4 libvirtodbc0 libxml2-utils nepomuk-core nepomuk-core-data ntrack-module-libnl-0 odbcinst odbcinst1debian2 oxygen-icon-theme phonon phonon-backend-gstreamer plasma-scriptengine-javascript qapt-batch shared-desktop-ontologies soprano-daemon virtuoso-minimal virtuoso-opensource-6.1-bin virtuoso-opensource-6.1-common
    Suggested packages:
      libterm-readline-gnu-perl libterm-readline-perl-perl djvulibre-bin finger hspell libqca2-plugin-cyrus-sasl libqca2-plugin-gnupg libqca2-plugin-ossl phonon-backend-vlc phonon-backend-xine phonon-backend-mplayer
    The following packages will be REMOVED:
      aptoncd* apturl* mintupdate* mintwelcome* synaptic*
    The following NEW packages will be installed:
      apturl-kde icoutils kate-data katepart kde-runtime kde-runtime-data kdelibs-bin kdelibs5-data kdelibs5-plugins kdesudo kdoctools kubuntu-debug-installer libattica0.4 libdlrestrictions1 libkactivities-bin libkactivities-models1 libkactivities6 libkatepartinterfaces4 libkcmutils4 libkde3support4 libkdeclarative5 libkdecore5 libkdesu5 libkdeui5 libkdewebkit5 libkdnssd4 libkemoticons4 libkfile4 libkhtml5 libkidletime4 libkio5 libkjsapi4 libkjsembed4 libkmediaplayer4 libknewstuff3-4 libknotifyconfig4 libkntlm4 libkparts4 libkpty4 libkrosscore4 libktexteditor4 libkxmlrpcclient4 libnepomuk4 libnepomukcore4abi1 libnepomukquery4a libnepomukutils4 libntrack-qt4-1 libntrack0 libphonon4 libplasma3 libpolkit-qt-1-1 libpoppler-qt4-4 libqapt2 libqapt2-runtime libqca2 libqt4-qt3support libsolid4 libsoprano4 libstreamanalyzer0 libstreams0 libthreadweaver4 libvirtodbc0 libxml2-utils nepomuk-core nepomuk-core-data ntrack-module-libnl-0 odbcinst odbcinst1debian2 oxygen-icon-theme phonon phonon-backend-gstreamer plasma-scriptengine-javascript qapt-batch shared-desktop-ontologies soprano-daemon virtuoso-minimal virtuoso-opensource-6.1-bin virtuoso-opensource-6.1-common
    0 upgraded, 78 newly installed, 5 to remove and 0 not upgraded.
    Need to get 60.9 MB of archives.
    After this operation, 146 MB of additional disk space will be used.
    Do you want to continue [Y/n]? n
    Abort.

    As you can see, apt-get is trying to install the same packages that it is asking me to autoremove. Could someone please tell me how to uninstall synaptic properly? Or am I missing something? Just for the record, I also did

    sudo apt-get autoremove --purge

    like it asked me to ... and this is what I got:

    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    0 upgraded, 0 newly installed, 0 to remove and 6 not upgraded.
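
    The KDE pile appears because apt is evidently choosing apturl-kde to satisfy a dependency of some package that stays behind once the GTK apturl is purged. A way to identify the culprit and retry without recommended packages (a sketch, not a guaranteed fix):

    # ask aptitude to explain what is dragging in the KDE alternative
    aptitude why apturl-kde
    # then retry the removal; skipping recommends may avoid the KDE packages
    sudo apt-get remove --purge --no-install-recommends synaptic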

  • WLS Console Timeout

    - by john.graves(at)oracle.com
    The WebLogic console timeout is a great feature for security, yet a horrible feature during development. Logging in over and over again gets to be annoying. This is very easy to change, but I would never do this on a production system!

    Find the WebLogic consoleapp weblogic.xml file. This is typically in your WL_HOME/server/lib/consoleapp/webapp/WEB-INF/ directory. Edit the weblogic.xml file: update the section shown and increase the timeout-secs. I just throw an extra zero at the end, giving me ten full hours of fun!!!

    <session-descriptor>
      <timeout-secs>36000</timeout-secs>
      <invalidation-interval-secs>60</invalidation-interval-secs>
      <cookie-name>ADMINCONSOLESESSION</cookie-name>
      <cookie-max-age-secs>-1</cookie-max-age-secs>
      <cookie-http-only>false</cookie-http-only>
      <url-rewriting-enabled>false</url-rewriting-enabled>
    </session-descriptor>

  • Error installing Windows 7 64-bit on VirtualBox

    - by MetaDark
    I am trying to set up Windows in VirtualBox so I don't need to reboot on the rare occasion that I actually need it. The problem is, VirtualBox doesn't report any errors when I insert the 32-bit installation CD, but when I try to use the 64-bit installation: What!? I am already using the installation disc! I've checked my BIOS to see if I have SVM (AMD's version of VT) disabled, and all I see is "Enabled". I have a K9N6PGM2-V2 motherboard, a triple-core AMD Athlon II, an NVIDIA nForce 430 integrated graphics card, 4GB of RAM, an 80GB IDE drive, and a 1TB SATA drive. I don't think the last three specifications matter, but just in case. I am pretty sure the CD isn't broken (I am going to make sure in just a moment). What could be the cause of this problem? Edit: The 64-bit installation CD is not broken, but I found out when trying to install from the 32-bit version that it's trying to upgrade, not perform a fresh install - odd.

  • Stepping outside Visual Studio IDE [Part 2 of 2] with Mono 2.6.4

    - by mbcrump
    Continuing part 2 of my "Stepping outside the Visual Studio IDE" series is the open-source Mono Project. Mono is a software platform designed to allow developers to easily create cross-platform applications. Sponsored by Novell (http://www.novell.com/), Mono is an open source implementation of Microsoft's .NET Framework based on the ECMA standards for C# and the Common Language Runtime. A growing family of solutions and an active and enthusiastic contributing community is helping position Mono to become the leading choice for development of Linux applications. So, to clarify: you can use Mono to develop .NET applications that will run on Linux, Windows, or Mac. It's basically an IDE that has roots in Linux. Let's first look at compatibility.

    Compatibility. If you already have an application written in .NET, you can scan it with the Mono Migration Analyzer (MoMA) to determine if it uses anything not supported by Mono. The current release version of Mono is 2.6 (released December 2009). The easiest way to describe what Mono currently supports is: everything in .NET 3.5 except WPF and WF, with limited WCF. Here is a slightly more detailed view, by .NET framework version:
    - Implemented: C# 3.0, System.Core, LINQ, ASP.NET 3.5, ASP.NET MVC; C# 2.0 (generics), core libraries 2.0 (mscorlib, System, System.Xml), ASP.NET 2.0 (except WebParts), ADO.NET 2.0, Winforms/System.Drawing 2.0 (does not support right-to-left); C# 1.0, core libraries 1.1 (mscorlib, System, System.Xml), ASP.NET 1.1, ADO.NET 1.1, Winforms/System.Drawing 1.1
    - Partially implemented: LINQ to SQL (mostly done, but a few features missing); WCF (Silverlight 2.0 subset completed)
    - Not implemented: WPF (no plans to implement); WF (will implement WF 4 instead in future versions of Mono); System.Management (does not map to Linux); System.EnterpriseServices (deprecated)
    Links to documentation: the official Mono FAQs. Links to binaries: Mono IDE, latest version 2.6.4. That's it; nothing more is required to compile and run .NET code on Linux.

    Installation. After landing on the Mono project home page, you can select which platform you want to download. I typically pick the Virtual PC image since I spend all of my day using Windows 7. Go ahead and pick whatever version is best for you; the Virtual PC image comes with SUSE Linux. Once the image is launched, you will see the menu screen. I'm not going to go through each option, but it's best to start with the "Start Here" icon, which will provide you with information on new projects or existing VS projects. After you get Mono installed, it's probably a good idea to run a quick Hello World program to make sure everything is set up properly. This lets you know that Mono is working before you try writing or running a more complex application. To write a "Hello World" program, follow these steps:
    1. Start Mono Development Environment.
    2. Create a new project: File->New->Solution.
    3. Select "Console Project" in the category list.
    4. Enter a project name into the Project name field, for example "HW Project", and click "Forward".
    5. Click "Packaging", then OK.
    You should have a screen very similar to a VS console app. Click the "Run" button in the toolbar (Ctrl-F5), look in the Application Output, and you should see "Hello World!". That should do it for a simple console app in Mono.
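
    If you would rather verify the toolchain from a terminal than from the IDE, here is a minimal sketch (gmcs is the C# compiler that ships with Mono 2.x):

    cat > hello.cs <<'EOF'
    class Hello {
        static void Main() {
            System.Console.WriteLine("Hello World!");
        }
    }
    EOF
    gmcs hello.cs    # compiles to hello.exe
    mono hello.exe   # prints: Hello World!
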
    To test out an ASP.NET application, simply copy your code to a new directory in /srv/www/htdocs, then visit the following URL: http://localhost/directoryname/page.aspx, where directoryname is the directory where you deployed your application and page.aspx is the initial page for your software.

    Databases. You can continue to use a SQL Server database, or use MySQL, PostgreSQL, Sybase, Oracle, IBM's DB2, or SQLite.

    Conclusion. I hope this brief look at the Mono IDE helps someone get acquainted with development outside of VS. As always, I welcome any suggestions or comments.

  • Bash script runs fine, but not in cron

    - by radiotech
    I have a script that's supposed to record a Shoutcast stream for an hour, convert it to MP3, and then save it. The script runs correctly when I run it from the terminal, but I can't seem to get it to run from cron (where it should run every hour, at the top of the hour). Here's the line in crontab:

    0 * * * * /medialib/tech/bin/recordstream 2>&1 >> /medialib/tech/cron.log

    and here's the script:

    #!/bin/bash
    name="$(date +%s)"
    mp3_name=$name.mp3
    wav_name=$name.wav
    timeout -sHUP 60m vlc -I dummy --sout "#transcode{channels=2}:std{access=file,mux=wav,dst=/medialib/stream_backup/wav/$wav_name}" /medialib/tech/lib/listen.m3u
    lame --mp3input /medialib/stream_backup/wav/$wav_name /medialib/stream_backup/$mp3_name
    rm /medialib/stream_backup/wav/$wav_name

    Thank you! EDIT: Contents of cron.log (this text has been in the log file since it was transferred from an old server where it was working):

    VLC media player 2.0.8 Twoflower
    Command Line Interface initialized. Type `help' for help.
    > Shutting down.
    VLC media player 2.0.8 Twoflower
    Command Line Interface initialized. Type `help' for help.
    > Shutting down.
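
    Two likely culprits, offered as a sketch rather than a confirmed diagnosis: cron runs with a minimal PATH (often just /usr/bin:/bin), so vlc, lame, or timeout may simply not be found; and in the crontab line above, "2>&1 >> file" duplicates stderr onto the terminal's stdout before the redirect happens, so errors never reach the log. The 2>&1 has to come after the file redirect:

    # crontab: redirect stdout to the log first, then send stderr to the same place
    0 * * * * /medialib/tech/bin/recordstream >> /medialib/tech/cron.log 2>&1
    # and near the top of the script, set an explicit PATH for cron's benefit
    PATH=/usr/local/bin:/usr/bin:/bin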

  • 3 Monitors 2 different graphic cards

    - by Michael
    I am looking to add a third monitor to my workstation. I currently have two monitors connected to a GeForce GT 120, and I am looking for the cheapest way to add another monitor. I have an old ATI X300 lying around and want to know if I could use that to power the third monitor, and whether it would conflict with my other graphics card (I am not a gamer). What I want to do is have the ATI X300 power one monitor via DVI and the GT 120 power two monitors via VGA and DVI. Thanks. Edit: I'm running Windows 7 64-bit.

  • Transform coordinates from 3d to 2d without matrix or built in methods

    - by Thomas
    Not too long ago I started to create a small 3D engine in JavaScript to combine with an HTML5 canvas. One of the issues I've run into is how to transform 3D coordinates to 2D. Since I cannot use matrices or built-in transformation methods, I need another way. I've tried implementing the explanation and pseudocode here: http://freespace.virgin.net/hugo.elias/routines/3d_to_2d.htm. Unfortunately, no luck there; I've replaced all the input variables with data from my own camera and object classes. I have the following data:
    - an object with a rotation vector, a position vector, and an array of four 3D coords (it's just a plane)
    - a camera with a position and rotation vector
    - the viewport: a square 600 x 600 surface (the example uses a zoom factor, which I've set to 1)
    Most hits on Google either use matrix calculations or don't implement camera rotation. The basic transformation should be like this:
    screen.x = x / z * zoom
    screen.y = y / z * zoom
    Can anyone point me in the right direction or explain to me how to achieve this? Edit: Thanks for all your posts. I haven't been able to apply all this to my project yet, but I hope to do so soon.
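
    For what it's worth, here is a matrix-free sketch of the usual pipeline in JavaScript (translate into camera space, rotate through the camera's yaw and pitch angles, then divide by depth). The sign conventions vary between sources, and any point whose final z is <= 0 lies behind the camera and must be skipped:

    // translate so the camera sits at the origin
    var dx = p.x - cam.x, dy = p.y - cam.y, dz = p.z - cam.z;
    // undo the camera's yaw (rotation about the Y axis)
    var x1 = dx * Math.cos(cam.yaw) - dz * Math.sin(cam.yaw);
    var z1 = dx * Math.sin(cam.yaw) + dz * Math.cos(cam.yaw);
    // undo the camera's pitch (rotation about the X axis)
    var y1 = dy * Math.cos(cam.pitch) - z1 * Math.sin(cam.pitch);
    var z2 = dy * Math.sin(cam.pitch) + z1 * Math.cos(cam.pitch);
    // perspective divide, centred on the 600 x 600 canvas
    var screenX = 300 + x1 / z2 * zoom;
    var screenY = 300 - y1 / z2 * zoom; // canvas y grows downward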

  • One of my most frequently used commands

    - by Kevin Smith
    On a Linux or UNIX server this is one of my most frequently used commands:

    find . -name "*.htm" -exec grep -iH "alter session" {} \;

    It is an easy way to find a string you know is in a group of files, but don't know or can't remember which file it is in. For the example above, I knew that WebCenter Content sends a bunch of ALTER SESSION commands to the database when it opens a new database connection. I wanted to find where these were defined and what all the ALTER SESSION commands were. So, I ran these commands:

    cd /opt/oracle/middleware/Oracle_ECM1/ucm/idc/resources/core
    find . -name "*.htm" -exec grep -iH "alter session" {} \;

    And the results were:

    ./tables/query.htm: ALTER SESSION SET optimizer_mode = ?
    ./tables/query.htm: ALTER SESSION SET NLS_LENGTH_SEMANTICS = ?
    ./tables/query.htm: ALTER SESSION SET NLS_SORT = ?
    ./tables/query.htm: ALTER SESSION SET NLS_COMP = ?
    ./tables/query.htm: ALTER SESSION SET CURSOR_SHARING = ?
    ./tables/query.htm: ALTER SESSION SET EVENTS '30579 trace name context forever, level 2'
    ./tables/query.htm: ALTER SESSION SET NLS_DATE_FORMAT = ?
    ./tables/query.htm: alter session set events '30579 trace name context forever, level 2'

    I could then go edit the query.htm file and find the include that contained all the ALTER SESSION commands.
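
    With GNU grep the same search can be written without find, though the find version is more portable to other UNIXes:

    grep -riH --include="*.htm" "alter session" .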

  • Securely automount encrypted drive at user login

    - by Tom Brossman
    An encrypted /home directory gets mounted automatically for me when I log in. I have a second internal hard drive that I've formatted and encrypted with Disk Utility, and I want it to be automatically mounted when I log in, just like my encrypted /home directory. How do I do this? There are several very similar questions here, but the answers don't apply to my situation. It might be best to close/merge my question here and edit the second one below, but I think it may have been abandoned (and therefore will never have an accepted answer). One suggested solution isn't secure, since it circumvents the encryption. Another requires editing fstab, which necessitates entering an additional password at boot; that's not automatic like mounting /home. A third question is very similar, but does not apply to an encrypted drive, so its solution won't work for my needs. And another covers NTFS drives, while mine is ext4. I can re-format and re-encrypt the second drive if a solution requires it; I've got all the data backed up elsewhere.
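
    One approach that keeps the encryption intact is pam_mount, which unlocks and mounts a volume during login using the login password (which means the drive's passphrase has to be set to match it). A sketch, assuming the second drive is a LUKS-encrypted /dev/sdb1 and the user name is tom:

    sudo apt-get install libpam-mount
    # then add a volume entry to /etc/security/pam_mount.conf.xml, e.g.:
    # <volume user="tom" fstype="crypt" path="/dev/sdb1" mountpoint="/media/data" />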

  • How to maintain symlinks in linux file manager?

    - by MountainX
    I want to use symlinks extensively. However, if I move the target file, the symlink becomes broken (unlike on Windows). That's not acceptable to me, so I either need a solution or I won't be able to use symlinks the way I wish to. Is there a solution that will work with the Dolphin file manager? A command-line solution is described on commandlinefu. In summary, it is something like one of these:

    # usage: lmv file1 [file2 ...] destdir  - moves the files, leaving symlinks behind
    lmv() { for a in ${@:1:$(expr $# - 1)}; do [ -e "$a" -a -e "${@:$#:1}" ] && mv "$a" "${@:$#:1}" && ln -s "${@:$#:1}"/"$(basename "$a")" "$(dirname "$a")"; done; }
    lmv() { for a in ${@:1:$(expr $# - 1)}; do [ -e "$a" -a -e "${@:$#}" ] && mv "$a" "${@:$#}" && ln -s "${@:$#}"/"$(basename "$a")" "$(dirname "$a")"; done; }

    But about half the time I'm using a file manager (Dolphin), so I need a complete solution to this problem. Is a solution available for a GUI file manager? EDIT: The context of this question is that I'm searching for an alternative to hardlinks. I previously asked this question about the pitfalls of hardlinks.

  • How disable Apple iCal from popping up with every email invite/update?

    - by Sean
    My iCal has new behavior (since upgrading to SL). Every time I get an iCal attachment in Mail, the iCal app flies up in my face. I don't see any way to turn off this behavior, and it's amazingly disruptive when I'm busy with other activities. Help? EDIT: I want iCal to keep adding the invitations, so that when I cmd-tab to the app those items are in the queue awaiting approval. What I am hoping to learn is how to stop the popup action that forces the application to become the top-level window.

  • Exchange 2010 Deployment Notes - ISA 2004 Server Issue

    - by BWCA
    An interesting ISA 2004 tidbit … While we were setting up our Exchange 2010 ActiveSync environment, we encountered a problem where we could not successfully telnet over port 443 from one of our ISA 2004 servers to our Exchange 2010 Client Access Server array. When we tried to telnet over port 443 from the ISA server to the Client Access Server array name, we would get a "Could not open connection to the host on port 443: Connect failed" error message. Also, when we used portqry over port 443 from the ISA server to the Client Access Server array name, we would get "Error opening socket: 10065" and "No route to host" error messages. It was odd, because we did not have any problems using ping or tracert from the ISA server to the Client Access Server array, and our firewall policy was allowing 443 traffic to pass through. After some troubleshooting, we were able to telnet and use portqry over port 443 successfully if we stopped the Microsoft Firewall service on the ISA 2004 server, so it was strictly a problem with ISA. Eventually, we were able to isolate the problem to an ISA 2004 Server System Policy setting, as shown below (to modify the System Policy, right-click Firewall Policy and click Edit System Policy). Under the Diagnostics Services - HTTP Connectivity Verifiers configuration group, you need to enable the configuration group under the General tab to resolve the problem. After we enabled the setting, we no longer had the problem.

  • Exchange mail users cannot send to certain lists

    - by blsub6
    First of all, everyone's on Exchange 2010 using OWA. I have a dynamic distribution list called 'staff' that contains all users in my domain. I can send to this list, and other people can send to this list, but I have one user who cannot send to it. Sending to the list gets that user an email back with the error:

    Delivery has failed to these recipients or groups:
    Staff
    The e-mail address you entered couldn't be found. Please check the recipient's e-mail address and try to resend the message. If the problem continues, please contact your helpdesk.

    followed by a bunch of diagnostic information that I don't want to paste here because I don't want to have to censor all of the sensitive information it contains (lazy). Can you guys throw me some possible reasons why this would happen? If there are innumerable reasons, where should I start troubleshooting? EDIT: One Exchange server inside the network that acts as a transport server, client access server, and mailbox server, and one Edge Transport server in the DMZ.
