Search Results

Search found 75091 results on 3004 pages for 'person who needs help'.


  • sudoers file cleanup and consolidation tool/script

    - by Prashanth Sundaram
    Hello all, I am curious to know what other folks out there are using to keep their sudoers file in a sane state. I am looking for a tool that removes redundant entries and overlapping permissions, and/or presents the sudoers file in an organized way (for example, sorted by permissions, users or aliases). I use SVN for version control and a config management tool for deployment; is there any add-on or plugin you would recommend or use? User_Alias RT1123 jappleseed, sjobs Host_Alias HOST_RT1123 wdc101.domain.com, wdc104.domain.com Cmnd_Alias ..... Our sudoers file is simple but has a lot of entries, and it needs to be cleaned up. Does anyone know of or have a tool/script to fix or present it? Thanks!
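    A hedged first-pass sketch, not a full sudoers parser: the script below only flags lines that repeat verbatim, it assumes the file path shown, and it does not understand #include directives, line continuations, or overlapping permissions. Run it against a copy, and always validate any hand-edited result with visudo -c.

```python
# Rough sketch only: report lines that appear more than once in a sudoers file.
# Assumptions: the path below, no #include handling, no '\' continuations;
# it catches literal duplicates, not overlapping permissions.
from collections import Counter

SUDOERS = "/etc/sudoers"   # assumed path - point it at a copy of your file

entries = Counter()
with open(SUDOERS) as f:
    for raw in f:
        line = raw.split("#", 1)[0].strip()     # drop comments and blank lines
        if line:
            entries[" ".join(line.split())] += 1  # normalise whitespace

for entry, count in entries.most_common():
    if count > 1:
        print(f"{count}x  {entry}")
```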

    Read the article

  • rails bundler error installing nokogiri (1.5.5), and Bundler cannot continue

    - by Michael Durrant
    An error occurred while installing nokogiri (1.5.5), and Bundler cannot continue. How do I fix this and get past the error? Installing nokogiri (1.5.5) with native extensions Gem::Installer::ExtensionBuildError: ERROR: Failed to build gem native extension. /usr/bin/ruby1.8 extconf.rb checking for libxml/parser.h... yes checking for libxslt/xslt.h... no ----- libxslt is missing. please visit http://nokogiri.org/tutorials/installing_nokogiri.html for help with installing dependencies.

    Read the article

  • MS Server 2003 Activation Loop

    - by RPGonzo
    Recently we had a motherboard failure on a terminal server, so we replaced the faulty motherboard, re-created the RAID arrays (the same motherboard model, but it still wouldn't recognize the old RAID setup) and recovered the system from a previous backup. No problem up to here; after restoring the system you are prompted to reboot and then log in. On login we get a message box stating that Windows needs to be activated and asking whether you want to activate now. Pressing Yes just makes the OS log you off and do nothing at all. You can try over and over, but to no avail. I found a few articles about a glitch in the activation script and how to reset it, and tried that with the same results. Hoping someone can share some knowledge if you have seen this before? Thanks!

    Read the article

  • Willy Rotstein on Supply Chain Planning

    - by sarah.taylor(at)oracle.com
    Each time a merchandiser, buyer or planner in Retail makes a business decision around assortment, inventory, pricing and promotions there is an opportunity to improve both Profitability and Customer Service. Improving decision making, however, has always been a tricky business for retailers.  I have worked in this space for more than 15 years. I began my career as an academic, at Imperial College London, and then broadened this interest with Retailers, aiming to optimize their merchandising and supply chain decisions. Planning the business and optimizing profit is a complex process. The complexity arises from the variety of people involved, the large number of decisions to take across all business processes, the uncertainty intrinsic to the retail environment as well as the volume of data available for analysis.  Things are not getting any easier either. The advent of multi-channel, social media and mobile is taking these complexities to a new level and presenting additional opportunities for those willing to exploit them. I guess it is due to the complexities of the decision making process that, over the last couple of years working with Oracle Retail, I have witnessed a clear trend around the deployment of planning systems. Retailers are aiming to simplify their decision making processes. They want to use one joined up planning platform across the business and enhance it with "actionable" data mining and optimization techniques. At Oracle Retail, we have a vibrant community of international retailers who regularly come together to discuss the big issues in retail planning. It is a combination of fashion, grocery and speciality retailers, all sharing their best practice vision for planning and optimizing merchandise decisions. As part of the Retail Exchange program, at the recent National Retail Federation event in New York, I jointly hosted a Planning dinner with Peter Fitzgerald from Google UK, Retail Division. Those retailers from our international planning community who were in New York for the annual NRF event were able to attend. The group comprised some of Europe's great International Retail brands.  All sectors were represented by organisations like Mango, LVMH, Ahold, Morrisons, Shop Direct and River Island. They confirmed the current importance of engaging with Planning and Optimization issues. In particular the impact of the internet was a key topic. We had a great debate about new retail initiatives.  Peter highlighted how mobility is changing retail - in particular with the new "local availability search" initiative. We also had an exciting discussion around the opportunities to improve merchandising using the new data that is becoming available from search, social media and ecommerce sites. It will be our focus to continue to help retailers translate this data into better results while keeping their business operations simple. New developments in "actionable" analytics and computing capacity make this a very exciting area today. Watch this space for my contributions on these topics which will be made available through this blog. Oracle Retail has a strong Planning community. if you are a category manager, a planner, a buyer, a merchandiser, a retail supplier or any retail executive with a keen interest in planning then you would be very welcome to join Oracle Retail's Planning Community. 
As part of our community you will be able to join our in-person and virtual events, download topical white papers and best practice information specifically tailored to your area of interest.  If anyone would like to register their interest in joining our community of retailers discussing planning then please contact me at [email protected]   Willy Rotstein, Oracle Retail

    Read the article

  • Use Autoruns to Manually Clean an Infected PC

    - by Mark Virtue
    There are many anti-malware programs out there that will clean your system of nasties, but what happens if you're not able to use such a program?  Autoruns, from SysInternals (recently acquired by Microsoft), is indispensable when removing malware manually. There are a few reasons why you may need to remove viruses and spyware manually: Perhaps you can't abide running resource-hungry and invasive anti-malware programs on your PC You might need to clean your mom's computer (or someone else who doesn't understand that a big flashing sign on a website that says "Your computer is infected with a virus - click HERE to remove it" is not a message that can necessarily be trusted) The malware is so aggressive that it resists all attempts to automatically remove it, or won't even allow you to install anti-malware software Part of your geek credo is the belief that anti-spyware utilities are for wimps Autoruns is an invaluable addition to any geek's software toolkit.  It allows you to track and control all programs (and program components) that start automatically with Windows (or with Internet Explorer).  Virtually all malware is designed to start automatically, so there's a very strong chance that it can be detected and removed with the help of Autoruns. We have covered how to use Autoruns in an earlier article, which you should read if you need to first familiarize yourself with the program. Autoruns is a standalone utility that does not need to be installed on your computer.  It can simply be downloaded, unzipped and run (link below).  This makes it ideally suited for adding to your portable utility collection on your flash drive. When you start Autoruns for the first time on a computer, you are presented with the license agreement: After agreeing to the terms, the main Autoruns window opens, showing you the complete list of all software that will run when your computer starts, when you log in, or when you open Internet Explorer: To temporarily disable a program from launching, uncheck the box next to its entry.  Note:  This does not terminate the program if it is running at the time - it merely prevents it from starting next time.  To permanently prevent a program from launching, delete the entry altogether (use the Delete key, or right-click and choose Delete from the context menu).  Note:  This does not remove the program from your computer - to remove it completely you need to uninstall the program (or otherwise delete it from your hard disk). Suspicious Software It can take a fair bit of experience (read "trial and error") to become adept at identifying what is malware and what is not.  Most of the entries presented in Autoruns are legitimate programs, even if their names are unfamiliar to you.  Here are some tips to help you differentiate the malware from the legitimate software: If an entry is digitally signed by a software publisher (i.e. there's an entry in the Publisher column) or has a "Description", then there's a good chance that it's legitimate If you recognize the software's name, then it's usually okay.  Note that occasionally malware will "impersonate" legitimate software, by adopting a name that's identical or similar to software you're familiar with (e.g. "AcrobatLauncher" or "PhotoshopBrowser").  Also, be aware that many malware programs adopt generic or innocuous-sounding names, such as "Diskfix" or "SearchHelper" (both mentioned below). Malware entries usually appear on the Logon tab of Autoruns (but not always!)
    If you open up the folder that contains the EXE or DLL file (more on this below), and examine the "last modified" date, the dates are often from the last few days (assuming that your infection is fairly recent) Malware is often located in the C:\Windows folder or the C:\Windows\System32 folder Malware often only has a generic icon (to the left of the name of the entry) If in doubt, right-click the entry and select Search Online… The list below shows two suspicious-looking entries:  Diskfix and SearchHelper These entries, highlighted above, are fairly typical of malware infections: They have neither descriptions nor publishers They have generic names The files are located in C:\Windows\System32 They have generic icons The filenames are random strings of characters If you look in the C:\Windows\System32 folder and locate the files, you'll see that they are some of the most recently modified files in the folder (see below) Double-clicking on the items will take you to their corresponding registry keys: Removing the Malware Once you've identified the entries you believe to be suspicious, you now need to decide what you want to do with them.  Your choices include: Temporarily disable the Autorun entry Permanently delete the Autorun entry Locate the running process (using Task Manager or similar) and terminate it Delete the EXE or DLL file from your disk (or at least move it to a folder where it won't be automatically started) or all of the above, depending upon how certain you are that the program is malware. To see if your changes succeeded, you will need to reboot your machine, and check any or all of the following: Autoruns - to see if the entry has returned Task Manager (or similar) - to see if the program was started again after the reboot Check the behavior that led you to believe that your PC was infected in the first place.  If it's no longer happening, chances are that your PC is now clean Conclusion This solution isn't for everyone and is most likely geared to advanced users. Usually using a quality antivirus application does the trick, but if not, Autoruns is a valuable tool in your anti-malware kit. Keep in mind that some malware is harder to remove than others.  Sometimes you need several iterations of the steps above, with each iteration requiring you to look more carefully at each Autorun entry.  Sometimes the instant that you remove the Autorun entry, the malware that is running replaces the entry.  When this happens, we need to become more aggressive in our assassination of the malware, including terminating programs (even legitimate programs like Explorer.exe) that are infected with malware DLLs. Shortly we will be publishing an article on how to identify, locate and terminate processes that represent legitimate programs but are running infected DLLs, in order that those DLLs can be deleted from the system.
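    For readers who want to see programmatically the sort of thing the Logon tab summarises, here is a minimal, read-only sketch. It is an illustration only, not a substitute for Autoruns: it looks at just two of the classic Run keys, while Autoruns covers dozens of autostart locations.

```python
# Minimal sketch: list entries from two common autostart registry keys.
# Autoruns inspects far more locations than this; illustration only.
import winreg

RUN_KEYS = [
    (winreg.HKEY_LOCAL_MACHINE, r"SOFTWARE\Microsoft\Windows\CurrentVersion\Run"),
    (winreg.HKEY_CURRENT_USER,  r"SOFTWARE\Microsoft\Windows\CurrentVersion\Run"),
]

for hive, path in RUN_KEYS:
    try:
        with winreg.OpenKey(hive, path) as key:
            _, value_count, _ = winreg.QueryInfoKey(key)
            for i in range(value_count):
                name, command, _ = winreg.EnumValue(key, i)
                print(f"{path}  {name} -> {command}")
    except OSError:
        pass  # the key may not exist on this machine
```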
    Download Autoruns from SysInternals

    Read the article

  • SQLAuthority News – Book Signing Event – SQLPASS 2011 Event Log

    - by pinaldave
    I have been dreaming of writing a book for a really long time, and I finally got the chance - in fact, two chances!  I recently wrote two books: SQL Programming Joes 2 Pros: Programming and Development for Microsoft SQL Server 2008 [Amazon] | [Flipkart] | [Kindle] and SQL Wait Stats Joes 2 Pros: SQL Performance Tuning Techniques Using Wait Statistics, Types & Queues [Amazon] | [Flipkart] | [Kindle].  I had a lot of fun writing these two books, even though sometimes I had to sacrifice some family time and time for other personal development to write them. The good side of writing books is having the effort put into them recognized by readers and by kind organizations like expressor studio. Book Signing Event Book writing is a complex process.  Even after you spend months, maybe years, writing the material you still have to go through the editing and fact-checking processes.  And, once the book is out there, there is no way to take back all the copies to change mistakes or add something you forgot.  Most of the time it is a one-way street. Book Signing Event Just like every author, I had a dream that after the books were written, they would be loved by people and gain acceptance by an audience. My first book, SQL Programming Joes 2 Pros: Programming and Development for Microsoft SQL Server 2008, is extremely popular because it helps lots of people learn various fundamental topics. My second book covers beginning to learn SQL Server Wait Stats, which is a relatively new subject. This book has had very good acceptance in the community. Book Signing Event Helping my community is my primary focus, so I was happy to see this year's SQLPASS tag line: 'This is a Community.' At the event, the expressor studio guys came up with a very novel idea. They had previously used my books and had found them very useful. They got 100 copies of the book and decided to give them away to community folks. They invited me and my co-author Rick Morelan to hold a book signing event. We did a book signing on Thursday between 1 pm and 2 pm. Book Signing Event This event was one of the best events for me. This was my first book signing event outside of India. I reached the book signing location around 20 minutes before the scheduled time and what I saw was a big line for the book signing event. I felt very honored looking at the crowd and all the people around the event location. I felt very humbled when I saw some of my very close friends standing in the line to get my signature. It was really heartwarming to see so many enthusiasts waiting for more than an hour to get my signature. While standing in line I had the chance to have a conversation with every single person who showed up for a signature. I made sure that I repeated every single name and wrote it in every book with my signature. There is a saying that if we write a name once we will remember it forever. I want to remember all of you who saw me at the book signing. Your comments were wonderful, your feedback was amazing and you were all very supportive. Book Signing Event I have made a note of every conversation I had with all of you while I was signing the books. Once again, I just want to express my thanks for coming to my book signing event. The whole experience was very humbling. On top of that, I want to thank the expressor studio people who made it possible and who organized the whole signing event. I am so thankful to them for facilitating the whole experience, which will be hard for any future experience to beat.
My books Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority Author Visit, SQLAuthority News, T SQL, Technology

    Read the article

  • ID number for sites [closed]

    - by Jonathan
    Possible Duplicate: please add a key fields to stackauth results It would be easier if sites each had an ID; it would help with keeping track of them, and a numeric ID is generally easier and smaller to work with than a name string. It would also help with changes of site names (such as when a site graduates from beta, or decides its name is not quite right during beta). Everything else has IDs, so why not sites?

    Read the article

  • how to run conky from terminal?

    - by Esmail0022
    http://www.unixmen.com/configure-con...t-howto-conky/ In this link there are 11 steps to get conky. I did all of them, but the terminal shows this message: The program 'conky' can be found in the following packages: * conky-cli * conky-std Try: sudo apt-get install and when I tried typing this I saw this message: ismail@ismail-ASUS:~$ sudo apt-get install conky [sudo] password for ismail: E: Could not get lock /var/lib/dpkg/lock - open (11: Resource temporarily unavailable) E: Unable to lock the administration directory (/var/lib/dpkg/), is another process using it? Can you help me?
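    The lock error itself just means another package manager (Software Updater, apt's scheduled tasks, Synaptic, another terminal) is still running; wait for it to finish or close it. A hedged sketch for spotting the culprit follows - it assumes the third-party psutil package is available, and simply running `ps aux | grep -E 'apt|dpkg'` would tell you the same thing.

```python
# Rough sketch (assumes psutil is installed): list processes likely to be
# holding /var/lib/dpkg/lock so you can wait for them or close them cleanly.
import psutil

SUSPECTS = ("apt", "apt-get", "aptitude", "dpkg", "synaptic",
            "update-manager", "unattended-upgrade")

for proc in psutil.process_iter(["pid", "name", "cmdline"]):
    name = (proc.info["name"] or "").lower()
    if any(s in name for s in SUSPECTS):
        cmdline = " ".join(proc.info["cmdline"] or [])
        print(proc.info["pid"], name, cmdline)
```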

    Read the article

  • What are the disadvantages of automated testing?

    - by jkohlhepp
    There are a number of questions on this site that give plenty of information about the benefits that can be gained from automated testing. But I didn't see anything that represented the other side of the coin: what are the disadvantages? Everything in life is a tradeoff and there are no silver bullets, so surely there must be some valid reasons not to do automated testing. What are they? Here are a few that I've come up with: Requires more initial developer time for a given feature Requires a higher skill level of team members Increased tooling needs (test runners, frameworks, etc.) Complex analysis is required when a failed test is encountered - is this test obsolete due to my change or is it telling me I made a mistake? Edit I should say that I am a huge proponent of automated testing, and I'm not looking to be convinced to do it. I'm looking to understand what the disadvantages are so that when I go to my company to make a case for it I don't look like I'm throwing around the next imaginary silver bullet. Also, I'm explicitly not looking for someone to dispute my examples above. I am taking it as true that there must be some disadvantages (everything has trade-offs) and I want to understand what those are.

    Read the article

  • Make it simple. Make it work.

    - by Sean Feldman
    In 2010 I had the experience of working for a business that had lots of challenges. One of those challenges was a lack of technical architecture and business value recognition, which translated into spending an enormous amount of manpower and money on creating C++ solutions for the desktop client without using .NET, in order to minimize the "footprint" (#2) of the client application in deployment environments. This was an awkward experience, considering that custom C++ code was created from scratch to make clients talk to the .NET backend, while simply having .NET as a dependency would have cut time to market by at least 50% (and I'm downplaying the estimate). Regardless, the recent Microsoft announcement about .NET vNext has reminded me of that experience and how short-sighted the architecture at that company was. The investment made in a C++ client that cannot be maintained internally by a team specialized in .NET has created a situation where the code will only become more brutal to maintain over time, and the number of developers who understand it will keep shrinking. Not only that: the ability to go cross-platform (#3) and the performance gains from native compilation (#1) would have been an immediate payback. Why am I saying all this? To make a simple point to myself and to remind myself again - when working on a product that needs to get to market, make it simple, make it work, and then see how technology is changing and how you can adopt it. Simplicity will not let you down. But a complex solution always will.

    Read the article

  • sftp to cygwin cmd fails with "received message too long"

    - by ana
    I have a Windows server running Cygwin, where OpenSSH is set up. I am able to ssh into the server fine. I need to run a script that I cannot amend, which needs to copy files via sftp and requires the shell to be cmd. However, it fails with the message "received message too long" because when cmd starts it displays the Windows banner, and sftp chokes on this. I can't find any way to disable the banner, or redirect the output, or in any way get around this problem. Does anyone know how I can fix this?
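    One hedged way to confirm that stray output is the culprit: run a no-op command over ssh and look at what comes back. sftp speaks a binary protocol over that same channel, so any banner text the remote shell emits for non-interactive sessions is exactly the kind of thing that produces "received message too long". The host name below is a placeholder, and this sketch assumes non-interactive ssh (key or cached password) works.

```python
# Sketch: anything printed here beyond the command's own (empty) output is
# text the remote side injects into the channel, which is what breaks sftp.
# "user@winserver" is a placeholder - substitute your own account and host.
import subprocess

result = subprocess.run(["ssh", "user@winserver", "exit"], capture_output=True)
print(len(result.stdout), "unexpected bytes on stdout")
print(result.stdout.decode(errors="replace"))
```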

    Read the article

  • Lubuntu Stability Problems

    - by marinara
    I installed Lubuntu 11.10 from scratch a few days ago; now I can't even run aptitude without crashing from the mess it's made of the package system. (Yes, I added a non-official repository, because I needed Firefox 8.) I'm a new user, so I don't know. Did I make a mistake by loading Lubuntu? I really like the LXDE stuff, but so many things seem very buggy. ***edit I think besides the sound problems, all of my other problems are with either LXDE or with openbox. And I think the sound problem is Lubuntu-specific also. Problems like windows not popping up, windows not getting painted, windows flashing in the panel for no reason, windows losing their title bar occasionally, the mousewheel changing desktops when I'm clearly in a window, the LXDE panel not responding to clicks, applications not launching from the menu, applications not launching from the Firefox downloads window, applications crashing when I don't do anything but click on them. Pretty sure all this stuff is openbox or LXDE. This is just too much work to keep Lubuntu running. It's a lemon. (Sorry, I realize I should be constructive but I'm tired.) It's pretty clear that openbox needs some damn bugfixes for compatibility with Oneiric. And LXDE isn't that great either. I expect things to launch when I click on them. I think installing the acpid package is what made it clear to me. If 20 seconds after installing the package (from the official repo) Linux has already crashed, then I'm in the wrong damn neighborhood.

    Read the article

  • Why are they putting "processors" on hard drives?

    - by Celeritas
    What does it mean when they have a processor on the hard drive, how does it work, and what benefit does it have? I don't understand - the CPU is the processor, and the hard drive transfers its contents to RAM. Do they have additional processors that preprocess the data somehow? Here are some examples: Western Digital WD Black WD1002FAEX 1TB "Dual processor speed" NETGEAR ReadyNAS 312 2-Bay Diskless Network Attached Storage "Dual-core Intel 2.1GHz processor and 2GB on-board memory" And routers now have processors too - why's that necessary? I guess it sort of makes sense - some logic needs to happen for the packets to be read in to know which ports to send them out on, but why did old routers not need them? Example of a wireless router with a processor: "Dual-core processor"

    Read the article

  • blurry image rendered

    - by Jason
    I'm using Direct2D to render a PNG image using an ID2D1BitmapRenderTarget, then calling its GetBitmap() function and rendering the image using ID2D1HwndRenderTarget::DrawBitmap(). Some of the images rendered this way are clear, but others appear blurry. I did some research and followed a tutorial to make my application "DPI Aware", but it didn't help. What could cause the rendered image to appear blurry? Has anyone experienced this issue before? What can I do about this?

    Read the article

  • BizTalk 2009 - BizTalk Server Best Practices Analyser

    - by StuartBrierley
    The BizTalk Server Best Practices Analyser allows you to carry out a configuration-level verification of your BizTalk installation, evaluating the deployed configuration but not modifying or tuning anything that it finds. The Best Practices Analyser uses "reading and reporting" to gather data from different sources, such as: Windows Management Instrumentation (WMI) classes SQL Server databases Registry entries When I first ran the analyser I got a number of errors. If you get any errors, they should all be acted upon to resolve them; you should then run the scan again and see if anything else is reported that needs acting upon. As you can see in the image above, the initial issue that jumped out at me was that the SQL Server Agent was not started. The reason for this was absent-mindedness - this run was against my development PC and I don't have SQL/BizTalk actively running unless I am using them.  Starting the agent service and running the scan again gave me the following results: This resolved most of the issues for me, but the next major issue to look at was that there was no tracking host running.  You can also see that I was still getting an error with two of the SQL jobs.  The problem here was that I had not yet configured these two SQL jobs.  Configuring the backup and purge jobs and then starting the tracking host before running the scan again gave: This had cleared all the critical issues, but I did still have a number of warnings.  For example, on this report I was warned that the BizTalk MessageBox is hosted on the BizTalk Server.  While this is known to be less than ideal, it is as I expected on my development environment, where I have installed Visual Studio, SQL and BizTalk on my laptop, and I was happy to ignore this and other similar warnings. In your case you should take a look at any warnings you receive and decide what you want to do about each of them in turn.

    Read the article

  • New Whitepaper: Primer on Integrating with EBS 12 with Other Applications

    - by Rekha Ayothi
    Oracle E-Business Suite offers several integration points and a variety of integration technologies. While a given integration point may be available through various technologies and products, it is important to select the best approach for your specific integration requirements. I am pleased to announce the publication of a new white paper that can help with this: Oracle E-Business Suite Release 12.1.3 - Integration Products and Technologies Primer (Note 1494997.1) This whitepaper reviews integration strategies for Oracle E-Business Suite applications that are available today. The intended audience is solution architects, integration consultants, and anyone else interested in learning about integration options with Oracle E-Business Suite. The white paper outlines the following enterprise application integration styles: Data-centric integration Integration through native interfaces Process-centric integration Event-driven integration B2B integration Integration through web services  The white paper also discusses Oracle E-Business Suite application layer products and technologies that address the specific needs of each of these integration styles. It concludes with criteria for selecting the appropriate integration-related tools and technologies for your requirements. Attending OpenWorld 2012? We have two sessions covering Oracle E-Business Suite integration. Please join us to hear more on this subject: CON9005 - Oracle E-Business Suite Integration Best Practices ( Tuesday, Oct 2, 1:15 PM - 2:15 PM - Moscone West 2018) CON8716 - Web Services and SOA Integration Options for Oracle E-Business Suite ( Thursday, Oct 4, 11:15 AM - 12:15 PM - Moscone West 2016)  Related Articles E-Business Suite Technology Sessions at OpenWorld 2012 Webcast Replay Available: SOA Integration Options for E-Business Suite BPEL 11.1.1.6 Certified for Prebuilt E-Business Suite 12.1.3 SOA Integrations New Whitepaper: Defining Web Applications Desktop Integrators That Return Error Messages

    Read the article

  • Using rounded corners in modern websites with CSS3

    - by nikolaosk
    This is going to be the sixth post in a series of posts regarding HTML 5. You can find the other posts here , here, here , here and here.In this post I will provide a hands-on example on how to use rounded corners (rounded corners in CSS3) in your website. I think this is the feature that is most required in the new modern websites.Most websites look great with their lovely round panels and rounded corner tab style menus. We could achieve that effect earlier but we should resort to complex CSS rules and images. I will show you how to accomplish this great feature with the power of CSS 3.We will not use Javascript.Javascript is required for IE 7, IE 8 and the notorious IE 6. The best solution for implementing corners using CSS and Javascript without using images is Nifty corners cube. There are detailed information how to achieve this in the link I provided. This solution is tested in earlier vesrions of IE (IE 6,IE 7,IE 8) and Opera,Firefox,Safari. In order to be absolutely clear this is not (and could not be) a detailed tutorial on HTML 5. There are other great resources for that.Navigate to the excellent interactive tutorials of W3School.Another excellent resource is HTML 5 Doctor.Two very nice sites that show you what features and specifications are implemented by various browsers and their versions are http://caniuse.com/ and http://html5test.com/. At this times Chrome seems to support most of HTML 5 specifications.Another excellent way to find out if the browser supports HTML 5 and CSS 3 features is to use the Javascript lightweight library Modernizr.In this hands-on example I will be using Expression Web 4.0.This application is not a free application. You can use any HTML editor you like.You can use Visual Studio 2012 Express edition. You can download it here.Before I go on with the actual demo I will use the (http://www.caniuse.com) to see the support for web fonts from the latest versions of modern browsers.Please have a look at the picture below. We see that all the latest versions of modern browsers support this feature.We can see that even IE 9 supports this feature.  Let's move on with the actual demo. This is going to be a rather simple demo.I create a simple HTML 5 page. The markup follows and it is very easy to use and understand <!DOCTYPE html><html lang="en">  <head>    <title>HTML 5, CSS3 and JQuery</title>    <meta http-equiv="Content-Type" content="text/html;charset=utf-8" >    <link rel="stylesheet" type="text/css" href="style.css">       </head>  <body>      <div id="header">      <h1>Learn cutting edge technologies</h1>    </div>        <div id="main">          <h2>HTML 5</h2>                        <p id="panel1">            HTML5 is the latest version of HTML and XHTML. The HTML standard defines a single language that can be written in HTML and XML. It attempts to solve issues found in previous iterations of HTML and addresses the needs of Web Applications, an area previously not adequately covered by HTML.          </p>      </div>             </body>  </html>Then I need to write the various CSS rules that style this markup. I will name it style.css   body{        line-height: 38px;        width: 1024px;        background-color:#eee;        text-align:center;      }#panel1 { margin:auto; text-align:left; background-color:#77cdef;width:400px; height:250px; padding:15px;font-size:16px;font-family:tahoma;color:#fff;border-radius: 20px;}Have a look below to see what my page looks like in IE 10. This is possible through the border-radious property. 
The colored panel has all four corners rounded with the same radius.We can add a border to the rounded corner panel by adding this property declaration in the #panel1,  border:4px #000 solid;We can have even better visual effects if we specify a radius for each corner.This is the updated version of the style.css. body{        line-height: 38px;        width: 1024px;        background-color:#eee;        text-align:center;      }#panel1 { margin:auto; text-align:left; background-color:#77cdef;border:4px #000 solid;width:400px; height:250px; padding:15px;font-size:16px;font-family:tahoma;color:#fff;border-top-left-radius: 20px;border-top-right-radius: 70px;border-bottom-right-radius: 20px;border-bottom-left-radius: 70px;} This is how my page looks in Firefox 15.0.1  In this final example I will show you how to style with CSS 3 (rounded corners) a horizontal navigation menu. This is the new version of the HTML markup<!DOCTYPE html><html lang="en">  <head>    <title>HTML 5, CSS3 and JQuery</title>    <meta http-equiv="Content-Type" content="text/html;charset=utf-8" >    <link rel="stylesheet" type="text/css" href="style.css">       </head>  <body>      <div id="header">      <h1>Learn cutting edge technologies</h1>    </div>        <div id="nav"><ul><li><a class="mymenu" id="activelink" href="http://weblogs.asp.net/controlpanel/blogs/posteditor.aspx?SelectedNavItem=Posts§ionid=1153&postid=8934038#">Main</a></li><li><a class="mymenu" href="http://weblogs.asp.net/controlpanel/blogs/posteditor.aspx?SelectedNavItem=Posts§ionid=1153&postid=8934038#">HTML 5</a></li><li><a class="mymenu" href="http://weblogs.asp.net/controlpanel/blogs/posteditor.aspx?SelectedNavItem=Posts§ionid=1153&postid=8934038#">CSS 3</a></li><li><a class="mymenu" href="http://weblogs.asp.net/controlpanel/blogs/posteditor.aspx?SelectedNavItem=Posts§ionid=1153&postid=8934038#">JQuery</a></li></ul></div>        <div id="main">          <h2>HTML 5</h2>                        <p id="panel1">            HTML5 is the latest version of HTML and XHTML. The HTML standard defines a single language that can be written in HTML and XML. It attempts to solve issues found in previous iterations of HTML and addresses the needs of Web Applications, an area previously not adequately covered by HTML.          </p>      </div>             </body>  </html> This is the updated version of style.css body{        line-height: 38px;        width: 1024px;        background-color:#eee;        text-align:center;      }#panel1 { margin:auto; text-align:left; background-color:#77cdef;border:4px #000 solid;width:400px; height:250px; padding:15px;font-size:16px;font-family:tahoma;color:#fff;border-top-left-radius: 20px;border-top-right-radius: 70px;border-bottom-right-radius: 20px;border-bottom-left-radius: 70px;}#nav ul {width:900px; position:relative;top:24px;}ul li { text-decoration:none; display:inline;}ul li a.mymenu { font-family:Tahoma; color:black; font-size:14px;font-weight:bold;background-color:#77cdef; color:#fff;border-top-left-radius:18px; border-top-right-radius:18px; border:1px solid black; padding:15px; padding-bottom:10px;margin :2px; text-decoration:none; border-bottom:none;}.mymenu:hover { background-color:#e3781a; color:black;} The CSS rules are the classic rules that are extensively used for styling menus.The border-radius property is still responsible for the rounded corners in the menu.This is how my page looks in Chrome version 21.  Hope it helps!!!

    Read the article

  • Why does every change on the Asus UX50v cause the screen resolution to be lost?

    - by Kaveh Shahbazian
    Why does every change on the Asus UX50v cause the screen resolution to be lost? Installing a new application, connecting to another wireless network, changing some settings, ... all cause this problem. For example, after installing an application the UX50v needs to restart, and when it restarts the resolution gets set to 640x480 (or 600x800) and the Hibernate and Sleep options have disappeared from the shutdown menu! (I have other problems with this Asus UX50v too - for example, I can't update Windows 7 because the update crashes on the UX50v - but this one is absolutely ridiculous and stupid!)

    Read the article

  • Making a Case For The Command Line

    - by Jesse Taber
    Originally posted on: http://geekswithblogs.net/GruffCode/archive/2013/06/30/making-a-case-for-the-command-line.aspx I have had an idea percolating in the back of my mind for over a year now that I've just recently started to implement. This idea relates to building out "internal tools" to ease the maintenance and on-going support of a software system. The system that I currently work on is (mostly) web-based, so traditionally we have built these internal tools in the form of pages within the app that are only accessible by our developers and support personnel. These pages allow us to perform tasks within the system that, for one reason or another, we don't want to let our end users perform (e.g. mass create/update/delete operations on data, flipping switches that turn paid modules of the system on or off, etc). When we try to build new tools like this we often struggle with the level of effort required to build them. Effort Required Creating a whole new page in an existing web application can be a fairly large undertaking. You need to create the page and ensure it will have a layout that is consistent with the other pages in the app. You need to decide what types of input controls need to go onto the page. You need to ensure that everything uses the same style as the rest of the site. You need to figure out what the text on the page should say. Then, when you figure out that you forgot about an input that should really be present, you might have to go back and re-work the entire thing. Oh, and in addition to all of that, you still have to, you know, write the code that actually performs the task. Everything other than the code that performs the task at hand is just overhead. We don't need a fancy date picker control in a nicely styled page for the vast majority of our internal tools. We don't even really need a page, for that matter. We just need a way to issue a command to the application and have it, in turn, execute the code that we've written to accomplish a given task. All we really need is a simple console application! Plumbing Problems A former co-worker of mine, John Sonmez, always advocated the Unix philosophy for building internal tools: start with something that runs at the command line, and then build a UI on top of that if you need to. John's idea has a lot of merit, and we tried building out some internal tools as simple Console applications. Unfortunately, this was often easier said than done. Doing a "File -> New Project" to build out a tool for a mature system can be pretty daunting because that new project is totally empty.  In our case, the web application code had a lot of "plumbing" built in: it managed authentication and authorization, it handled database connection management for our multi-tenanted architecture, and it managed all of the context that needs to follow a user around the application, such as their timezone and regional/language settings. In addition, the configuration file for the web application (a web.config in our case, because this is an ASP .NET application) is large and would need to be reproduced into a similar configuration file for a Console application. While most of these problems could be solved pretty easily with some refactoring of the codebase, building Console applications for internal tools still potentially suffers from one pretty big drawback: you'd have to execute them on a machine with network access to all of the needed resources.
    Obviously, our web servers can easily communicate with the database servers and can publish messages to our service bus, but the same is not true for all of our developer and support personnel workstations. We could have everyone run these tools remotely via RDP or SSH, but that's a bit cumbersome and certainly a lot less convenient than having the tools built into the web application that is so easily accessible. Mix and Match So we need a way to build tools that are easily accessible via the web application but also don't require the overhead of creating a user interface. This is where my idea comes into play: why not just build a command line interface into the web application? If it's part of the web application we get all of the plumbing that comes along with that code, and we're executing everything on the web servers, which means we'll have access to any external resources that we might need. Rather than having to incur the overhead of creating a brand new page for each tool that we want to build, we can create one new page that simply accepts a command in text form and executes it as a request on the web server. In this way, we can focus on writing the code to accomplish the task. If the tool ends up being heavily used, then (and only then) should we consider spending the time to build a better user experience around it. To be clear, I'm not trying to downplay the importance of building great user experiences into your system; we should all strive to provide the best UX possible to our end users. I'm only advocating this sort of bare-bones interface for internal consumption by the technical staff that builds and supports the software. This command line interface should be the "back end" to a highly polished and eye-pleasing public face. Implementation As I mentioned at the beginning of this post, this is an idea that I've had for a while but have only recently started building out. I've outlined some general guidelines and design goals for this effort as follows: Text in, text out: In the interest of keeping things as simple as possible, I want this interface to be purely text-based. Users will submit commands as plain text, and the application will provide responses in plain text. Obviously this text will be "wrapped" within the context of HTTP requests and responses, but I don't want to have to think about HTML or CSS when taking input from the user or displaying responses back to the user. Task-oriented code only: After building the initial "harness" for this interface, the only code that should need to be written to create a new internal tool should be code that is expressly needed to accomplish the task that the tool is intended to support. If we want to encourage and enable ourselves to build good tooling, we need to lower the barriers to entry as much as possible. Built-in documentation: One of the great things about most command line utilities is the 'help' switch that provides usage guidelines and details about the arguments that the utility accepts. Our web-based command line utility should allow us to build the documentation for these tools directly into the code of the tools themselves. I finally started trying to implement this idea when I heard about a fantastic open-source library called CLAP (Command Line Auto Parser) that lets me meet the guidelines outlined above. CLAP lets you define classes with public methods that can be easily invoked from the command line.
    Here's a quick example of the code that would be needed to create a new tool to do something within your system: public class CustomerTools { [Verb] public void UpdateName(int customerId, string firstName, string lastName) { //invoke internal services/domain objects/whatever to perform update } } This is just a regular class with a single public method (though you could have as many methods as you want). The method is decorated with the 'Verb' attribute that tells the CLAP library that it is a method that can be invoked from the command line. Here is how you would invoke that code: Parser.Run(args, new CustomerTools()); Note that 'args' is just a string[] that would normally be passed in from the static Main method of a Console application. Also, CLAP allows you to pass in multiple classes that define [Verb] methods, so you can opt to organize the code that CLAP will invoke in any way that you like. You can invoke this code from a command line application like this: SomeExe UpdateName -customerId:123 -firstName:Jesse -lastName:Taber 'SomeExe' in this example just represents the name of the .exe that would be created from our Console application. CLAP then interprets the arguments passed in order to find the method that should be invoked and automatically parses out the parameters that need to be passed in. After a quick spike, I've found that invoking the 'Parser' class can be done from within the context of a web application just as easily as it can from within the 'Main' method entry point of a Console application. There are, however, a few sticking points that I'm working around: Splitting arguments into the 'args' array like the command line: When you invoke a standard .NET console application you get the arguments that were passed in by the user split into a handy array (this is the 'args' parameter referenced above). Generally speaking they get split by whitespace, but it's also clever enough to handle things like keeping a phrase that is surrounded by quotes together as a single argument. We'll need to re-create this logic within our web application so that we can give the 'args' value to CLAP just like a console application would (a short sketch of the expected behavior follows after this list). Providing a response to the user: If you were writing a console application, you might just use Console.WriteLine to provide responses to the user as to the progress and eventual outcome of the command. We can't use Console.WriteLine within a web application, so I'll need to find another way to provide feedback to the user. Preferably this approach would allow me to use the same handler classes from both a Console application and a web application, so some kind of strategy pattern will likely emerge from this effort. Submitting files: Often an internal tool needs to support doing some kind of operation in bulk, and the easiest way to submit the data needed to support the bulk operation is in a file. Getting the file uploaded and available to the CLAP handler classes will take a little bit of effort. Mimicking the console experience: This isn't really a requirement so much as a "nice to have". To start out, the command-line interface in the web application will probably be a single 'textarea' control with a button to submit the contents to a handler that will pass it along to CLAP to be parsed and run. I think it would be interesting to use some JavaScript and CSS trickery to change that page into something with more of a "shell" interface look and feel.
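    As a rough illustration of that first sticking point (quote-aware splitting), the behavior to re-create is essentially what Python's shlex module does out of the box. It is shown here only to pin down the expected input and output; the web harness in the post would need an equivalent written in C#.

```python
# Illustration of shell-style argument splitting: whitespace separates
# arguments, but a quoted phrase stays together as a single argument.
import shlex

command = 'UpdateName -customerId:123 -firstName:Jesse -lastName:"Taber Jr"'
print(shlex.split(command))
# ['UpdateName', '-customerId:123', '-firstName:Jesse', '-lastName:Taber Jr']
```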
I’ll be blogging more about this effort in the future and will include some code snippets (or maybe even a full blown example app) as I progress. I also think that I’ll probably end up either submitting some pull requests to the CLAP project or possibly forking/wrapping it into a more web-friendly package and open sourcing that.

    Read the article

  • PPTPD with PAM authentication?

    - by Richard
    I need a VPN solution for my company. One requirement is to be able to use the built-in windows VPN client. We are running a Debian Etch server. I've managed to set up PPTPD but the authentication is based on the chap-secrets file. We already have all the user accounts set up on the server, so it'd be nice to use PAM authentication to get user/pass directly from the unix login. Is this possible to achieve and how? If not, is there any other VPN solution that can do this? Don't tell me OpenVPN, it needs additional software to be installed on the Windows machines. :)

    Read the article

  • eSeminar: Oracle’s Fusion Update for Partners

    - by Richard Lefebvre
    Oracle's Fusion Update for Partners - Thursday, November 17th - 6pm CET At OOW, Oracle unveiled Oracle Fusion Applications, the next generation of business applications. By setting the standard for application architecture, design and deployment, customers will be able to extend the value of their applications environment by using Oracle Fusion Applications components side-by-side with their existing applications portfolio. Delivered as a complete suite of modular applications, Oracle Fusion Applications coexist with existing Oracle Applications. As one module, a product family or the entire suite, customers can choose to leverage the advances pioneered by Oracle at a pace that matches business needs for a new level of performance. David Bowin, Director of Oracle's Fusion Applications Team, will host an eSeminar session to address various questions that our partners have regarding Oracle's Fusion Applications. See the schedule below and mark your calendar to attend. 9:00am - 10:00am Pacific (6pm CET) Click this link to add the event to your calendar: http://oukc.oracle.com/static11/opn/ics/98300.ics Dial-In: 1. 877-664-9137 / Passcode 98300 International: 706-634-9619 http://www.intercall.com/national/oracleuniversity/gdnam.html Access Live Event Learning Link: http://oukc.oracle.com/static09/opn/login/?t=livewebcast|c=1069641479 Webconference access -- http://ouweb.webex.com Session number: 591807958

    Read the article

  • CSS/HTML element formatting template

    - by Elgoog
    I'm looking for an HTML template that has all the different types of HTML elements, so that I can take this template and start the CSS process. This way I can look at the full style of the elements without creating the styles for the different elements as I go. I realize I could do this myself, but thought that someone else may have already done this or know where to find such a template. Thanks for your help.

    Read the article

  • Third-party open-source projects in .NET and Ruby and NIH syndrome

    - by Anton Gogolev
    The title might seem to be inflammatory, but it's here to catch your eye after all. I'm a professional .NET developer, but I try to follow other platforms as well. With Ruby being all hyped up (mostly due to Rails, I guess) I cannot help but compare the situation in open-source projects in Ruby and .NET. What I personally find interesting is that .NET developers are for the most part severely suffering from the NIH syndrome and are very hesitant to use someone else's code in pretty much any shape or form. Comparing it with Ruby, I see a striking difference. Folks out there have gems for literally every little piece of functionality imaginable. New projects are popping up left and right and are generally heartily welcomed. On the .NET side we have CodePlex, which I personally find to be a place where projects grow old and eventually get abandoned. Now, there certainly are several well-known and maintained projects, but the number of those pales in comparison with that of Ruby. Granted, NIH on the .NET devs' part comes mostly from the fact that there are very few quality .NET projects out there, let alone projects that solve their specific needs, but even if there is such a project, it's often frowned upon and is reinvented in-house. So my question is multi-fold: Do you find my observations anywhere near correct? If so, what are your thoughts on the quality and quantity of OSS projects in .NET? Again, if you do agree with my thoughts on "NIH in .NET", what do you think is causing it? And finally, is it Ruby's feature set & community standpoint (dynamic language, strong focus on testing) that allows for such easy integration of third-party code?

    Read the article

  • Removing connections to network printers that no longer exist

    - by Jeff Hardy
    I'm trying to delete some network printer connections from a Windows Server 2003 (SP2) machine because the printer shares no longer exist. Windows shows the printers' status as "Printer not found on server, unable to connect"; this is expected; the printers are now on a different server. Only one machine ever connected to the printer shares; it no longer needs to and I'd like to clean it up. However, when I try to delete the connections, I get an error message: --------------------------- Remove Printer --------------------------- Printer connection cannot be removed. Operation could not be completed. --------------------------- OK --------------------------- The "solutions" I've found online seem to be more voodoo than anything (and they still don't work!). Does anybody know how I can delete these long-gone printers?
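    If the normal UI refuses to remove them, stale connections can usually be found in the per-user registry. The sketch below only lists them (read-only); the exact key path is an assumption about where per-user network printer connections are commonly stored, so verify it in regedit on the machine and export the key before deleting anything.

```python
# Read-only sketch: enumerate per-user network printer connections.
# The registry path is an assumption - confirm it in regedit before
# removing anything, and back up (export) the key first.
import winreg

PATH = r"Printers\Connections"   # under HKEY_CURRENT_USER

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, PATH) as key:
    subkey_count, _, _ = winreg.QueryInfoKey(key)
    for i in range(subkey_count):
        print(winreg.EnumKey(key, i))   # e.g. ',,oldserver,SomePrinter'
```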

    Read the article
