Search Results

Search found 6591 results on 264 pages for 'community voice'.


  • CodePlex Daily Summary for Thursday, September 06, 2012

    CodePlex Daily Summary for Thursday, September 06, 2012

    Popular Releases

    - menu4web: menu4web 0.4.1 - javascript menu for web sites: This release is for those who believe that global variables are evil. menu4web has been wrapped into m4w singleton object. Added "Vertical Tabs" example which illustrates object notation.
    - WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.1: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit. Features: AsyncUI extensions, Controls and control extensions, Converters, Debugging helpers, Imaging, IO helpers, VisualTree helpers, Samples. Recent changes NOTE: Namespace changes DebugConsol...
    - iPDC - Free Phasor Data Concentrator: iPDC-v1.3.1: iPDC suite version 1.3.1, modifications and bugs fixed (from v1.3.0). New User Manual for iPDC-v1.3.1 available on websites. Bug resolved: PMU Simulator TCP connection error and hung connection for client (PDC). Now PMU Simulator (server) can communicate with more than one PDC (client) over TCP and UDP in parallel. PMU Simulator is now sending the exact data frames as mentioned in data rate by user. PMU Simulator data rate has been verified by iPDC database entries and PMU Connection Tes...
    - Microsoft SQL Server Product Samples: Database: AdventureWorks OData Feed: The AdventureWorks OData service exposes resources based on specific SQL views. The SQL views are a limited subset of the AdventureWorks database that results in several consuming scenarios: CompanySales, Documents, ManufacturingInstructions, ProductCatalog, TerritorySalesDrilldown, WorkOrderRouting. How to install the sample: You can consume the AdventureWorks OData feed from http://services.odata.org/AdventureWorksV3/AdventureWorks.svc. You can also consume the AdventureWorks OData fe...
    - Desktop Google Reader: 1.4.6: Sorting feeds alphabetical is now optional (see preferences window)
    - DotNetNuke® Community Edition CMS: 06.02.03: Major Highlights: Fixed issue where mailto: links were not working when sending bulk email. Fixed issue where users did not see friendship relationships (problem is in 6.2, which does not show in the Versions Affected list above). Fixed the issue with cascade deletes in comments in CoreMessaging_Notification. Fixed UI issue when using a date field as a required profile property during user registration. Fixed error when running the product in debug mode. Fixed visibility issue when...
    - Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.65: Fixed null-reference error in the build task constructor.
    - Active Forums for DotNetNuke CMS: Active Forums 5.0.0 RC: RC release of Active Forums 5.0.
    - Droid Explorer: Droid Explorer 0.8.8.7 Beta: Bug in the display icon for apk's, will fix with next release. Added fallback icon if unable to get the image/icon from the Cloud Service. Removed some stale plugins that were either out dated or incomplete. Added handler for *.ab files for restoring backups. Added plugin to create device backups. Backups stored in %USERPROFILE%\Android Backups\%DEVICE_ID%\. Added custom folder icon for the android backups directory. Better error handling for installing an apk. Bug fixes for the Runn...
    - BI System Monitor: v2.1: Data Audits report and supporting SQL, and SSIS package. Environment Overview report enhancements, improving the appearance, addition of data audit finding indicators. Note: SQL 2012 version coming soon.
    - Hidden Capture (HC): Hidden Capture 1.1: Hidden Capture 1.1 by Mohsen E.Dawatgar http://Hidden-Capture.blogfa.com
    - Ext Spec: Ext Spec 0.2.1: Refined examples and improved distribution options.
    - The Visual Guide for Building Team Foundation Server 2012 Environments: Version 1: --
    - Nearforums - ASP.NET MVC forum engine: Nearforums v8.5: Version 8.5 of Nearforums, the ASP.NET MVC Forum Engine. New features include: built-in search engine using Lucene.NET; flood control improvements; notifications improvements (sync option and mail body). View Roadmap for more details. webdeploy package sha1 checksum: 961aff884a9187b6e8a86d68913cdd31f8deaf83
    - WiX Toolset: WiX Toolset v3.6: WiX Toolset v3.6 introduces the Burn bootstrapper/chaining engine and support for Visual Studio 2012 and .NET Framework 4.5. Other minor functionality includes: WixDependencyExtension supports dependency checking among MSI packages. WixFirewallExtension supports more features of Windows Firewall. WixTagExtension supports Software Id Tagging. WixUtilExtension now supports recursive directory deletion. Melt simplifies pure-WiX patching by extracting .msi package content and updating .w...
    - Iveely Search Engine: Iveely Search Engine (0.2.0)
    - GmailDefaultMaker: GmailDefaultMaker 3.0.0.2: Add QQ Mail Bugfix
    - Smart Data Access layer: Smart Data Access Layer Ver 3: In this version, support for executing inline queries is added. Check the Documentation section for details.
    - DotNetNuke® Form and List: 06.00.04: DotNetNuke Form and List 06.00.04. Don't forget to backup your installation before upgrade. Changes in 06.00.04 - Fix: Sql Scripts for 6.003 missed object qualifiers within stored procedures. Fix: added missing resource "cmdCancel.Text" in form.ascx.resx. Changes in 06.00.03 - Fix: MakeThumbnail was broken if the application pool was configured to .Net 4. Change: Data is now stored in nvarchar(max) instead of ntext. Changes in 06.00.02 - The scripts are now compatible with SQL Azure, tested in a ne...
    - Coevery - Free CRM: Coevery 1.0.0.24: Add a sample database, and installation instructions.

    New Projects

    - Any-Service: AnyService is a .net 4.0 Windows service shell. It hosts any windows application in non-gui mode to run as a service.
    - BabyCloudDrives - the multi cloud drive desktop's application: wpf
    - BLACK ORANGE: Download The HPAD TEXT EDITOR and use it Wisely.
    - CodePlex New Release Checker: CodePlex New Release Checker is a small library that makes it easy to add "New Version Available!" functionality to your CodePlex project.
    - Collect
    - CSVManager
    - Exam Project: My Exam Project. Computer Vision, C and OpenCV
    - FTP: Hey guys thanks for checking out my ftp!
    - Haushaltsbuch: 1
    - ModMaker.Lua: ModMaker.Lua is an open source .NET library that parses and executes Lua code.
    - MyJabbr: MyJabbr
    - netduinoscope: Design shield and software to use netduino as oscilloscope
    - NetSurveillance Web Application: Net Surveillance Web Application
    - NiconicoApiHelper
    - OStega: A simple library for encrypting text into a bmp or png image.
    - OURORM: orm
    - TFS Cloud Deployment Toolkit: The TFS Cloud Deployment Toolkit is a set of tools that integrate with TFS 2010 to help manage configuration and deployment to various remote environments.
    - The Visual Guide for Building Team Foundation Server 2012 Environments: A step-by-step guide for building Team Foundation Server 2012 environments that include SharePoint Server 2010, SQL Server 2012, Windows Server 2012 and more!
    - WinRT LineChart: An attempt at creating a usable LineChart for everyone to use in his/her own Windows 8 Apps

    Read the article

  • Why Is Vertical Monitor Resolution so Often a Multiple of 360?

    - by Jason Fitzpatrick
    Stare at a list of monitor resolutions long enough and you might notice a pattern: many of the vertical resolutions, especially those of gaming or multimedia displays, are multiples of 360 (720, 1080, 1440, etc.). But why exactly is this the case? Is it arbitrary, or is there something more at work?

    Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    The Question

    SuperUser reader Trojandestroy recently noticed something about his display interface and needs answers:

    YouTube recently added 1440p functionality, and for the first time I realized that all (most?) vertical resolutions are multiples of 360. Is this just because the smallest common resolution is 480×360, and it’s convenient to use multiples? (Not doubting that multiples are convenient.) And/or was that the first viewable/conveniently sized resolution, so hardware (TVs, monitors, etc.) grew with 360 in mind? Taking it further, why not have a square resolution? Or something else unusual? (Assuming it’s usual enough that it’s viewable). Is it merely a pleasing-the-eye situation?

    So why have the display be a multiple of 360?

    The Answer

    SuperUser contributor User26129 offers us not just an answer as to why the numerical pattern exists, but a history of screen design in the process:

    Alright, there are a couple of questions and a lot of factors here. Resolutions are a really interesting field of psychooptics meeting marketing.

    First of all, why are the vertical resolutions on YouTube multiples of 360? This is of course just arbitrary; there is no real reason this is the case. The reason is that resolution here is not the limiting factor for YouTube videos – bandwidth is. YouTube has to re-encode every video that is uploaded a couple of times, and tries to use as few re-encoding formats/bitrates/resolutions as possible to cover all the different use cases. For low-res mobile devices they have 360×240, for higher-res mobile there’s 480p, and for the computer crowd there is 360p for 2xISDN/multiuser landlines, 720p for DSL and 1080p for higher-speed internet. For a while there were some other codecs than h.264, but these are slowly being phased out, with h.264 having essentially ‘won’ the format war and all computers being outfitted with hardware codecs for this.

    Now, there is some interesting psychooptics going on as well. As I said: resolution isn’t everything. 720p with really strong compression can and will look worse than 240p at a very high bitrate. But on the other side of the spectrum: throwing more bits at a certain resolution doesn’t magically make it better beyond some point. There is an optimum here, which of course depends on both resolution and codec. In general: the optimal bitrate is actually proportional to the resolution.

    So the next question is: what kind of resolution steps make sense? Apparently, people need about a 2x increase in resolution to really see (and prefer) a marked difference. Anything less than that and many people will simply not bother with the higher bitrates; they’d rather use their bandwidth for other stuff. This has been researched quite a long time ago and is the big reason why we went from 720×576 (415kpix) to 1280×720 (922kpix), and then again from 1280×720 to 1920×1080 (2MP). Stuff in between is not a viable optimization target. And again, 1440p is about 3.7MP, another ~2x increase over HD. You will see a difference there. 4K is the next step after that.
    Next up is that magical number of 360 vertical pixels. Actually, the magic number is 120 or 128. All resolutions are some kind of multiple of 120 pixels nowadays; back in the day they used to be multiples of 128. This is something that just grew out of the LCD panel industry. LCD panels use what are called line drivers, little chips that sit on the sides of your LCD screen and control how bright each subpixel is. Because historically, for reasons I don’t really know for sure (probably memory constraints), these multiple-of-128 or multiple-of-120 resolutions already existed, the industry-standard line drivers became drivers with 360 line outputs (1 per subpixel). If you would tear down your 1920×1080 screen, I would be putting money on there being 16 line drivers on the top/bottom and 9 on one of the sides. Oh hey, that’s 16:9. Guess how obvious that resolution choice was back when 16:9 was ‘invented’.

    Then there’s the issue of aspect ratio. This is really a completely different field of psychology, but it boils down to: historically, people have believed and measured that we have a sort of wide-screen view of the world. Naturally, people believed that the most natural representation of data on a screen would be in a wide-screen view, and this is where the great anamorphic revolution of the ’60s came from, when films were shot in ever wider aspect ratios. Since then, this kind of knowledge has been refined and mostly debunked. Yes, we do have a wide-angle view, but the area where we can actually see sharply – the center of our vision – is fairly round. Slightly elliptical and squashed, but not really more than about 4:3 or 3:2. So for detailed viewing, for instance for reading text on a screen, you can utilize most of your detail vision by employing an almost-square screen, a bit like the screens up to the mid-2000s.

    However, again, this is not how marketing took it. Computers in ye olden days were used mostly for productivity and detailed work, but as they commoditized and as the computer as media consumption device evolved, people didn’t necessarily use their computer for work most of the time. They used it to watch media content: movies, television series and photos. And for that kind of viewing, you get the most ‘immersion factor’ if the screen fills as much of your vision (including your peripheral vision) as possible. Which means widescreen.

    But there’s more marketing still. When detail work was still an important factor, people cared about resolution. As many pixels as possible on the screen. SGI was selling almost-4K CRTs! The most optimal way to get the maximum amount of pixels out of a glass substrate is to cut it as square as possible. 1:1 or 4:3 screens have the most pixels per diagonal inch. But with displays becoming more consumery, inch-size became more important, not amount of pixels. And this is a completely different optimization target. To get the most diagonal inches out of a substrate, you want to make the screen as wide as possible. First we got 16:10, then 16:9, and there have been moderately successful panel manufacturers making 22:9 and 2:1 screens (like Philips). Even though pixel density and absolute resolution went down for a couple of years, inch-sizes went up and that’s what sold. Why buy a 19″ 1280×1024 when you can buy a 21″ 1366×768? Eh… I think that about covers all the major aspects here.
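    As a quick sanity check of the line-driver arithmetic above (assuming, as the answer states, 360 subpixel outputs – i.e. 120 pixels – per driver chip), the driver counts for a 1920×1080 panel work out to:

    $$ \frac{1920}{120} = 16 \ \text{drivers (top/bottom)}, \qquad \frac{1080}{120} = 9 \ \text{drivers (side)}, \qquad 16:9 $$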
    There’s more of course; bandwidth limits of HDMI, DVI, DP and of course VGA played a role, and if you go back to the pre-2000s, graphics memory, in-computer bandwidth and simply the limits of commercially available RAMDACs played an important role. But for today’s considerations, this is about all you need to know.

    Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.

    Read the article

  • Augmenting your Social Efforts via Data as a Service (DaaS)

    - by Mike Stiles
    The following is the 3rd in a series of posts on the value of leveraging social data across your enterprise by Oracle VP Product Development Don Springer and Oracle Cloud Data and Insight Service Sr. Director Product Management Niraj Deo. In this post, we will discuss the approach and value of integrating additional “public” data via a cloud-based Data-as-a-Service platform (or DaaS) to augment your Socially Enabled Big Data Analytics and CX Management.

    Let’s assume you have a functional Social-CRM platform in place. You are now successfully and continuously listening and learning from your customers and key constituents in Social Media, you are identifying relevant posts and following up with direct engagement where warranted (both 1:1, 1:community, 1:all), and you are starting to integrate signals for communication into your appropriate Customer Experience (CX) Management systems as well as insights for analysis in your business intelligence application. What is the next step?

    Augmenting Social Data with other Public Data for More Advanced Analytics

    When we say advanced analytics, we are talking about understanding causality and correlation from a wide variety, volume and velocity of data to Key Performance Indicators (KPI) to achieve and optimize business value. And in some cases, to predict future performance to make appropriate course corrections and change the outcome to your advantage while you can. The data to acquire, process and analyze this is very nuanced:

    - It can vary across structured, semi-structured, and unstructured data
    - It can span across content, profile, and communities of profiles data
    - It is increasingly public, curated and user generated

    The key is not just getting the data, but making it value-added data and using it to help discover the insights to connect to and improve your KPIs. As we spend time working with our larger customers on advanced analytics, we have seen a need arise for more business applications to have the ability to ingest and use “quality” curated, social, transactional reference data and corresponding insights. The challenge for the enterprise has been getting this data inline into an easily accessible system and providing the contextual integration of the underlying data enriched with insights to be exported into the enterprise’s business applications. The following diagram shows the requirements for this next-generation data and insights service (DaaS).

    Some quick points on these requirements:

    - Public Data, which in this context is about Common Business Entities, such as:
      - Customers, Suppliers, Partners, Competitors (all are organizations)
      - Contacts, Consumers, Employees (all are people)
      - Products, Brands
    - This data can be broadly categorized incrementally as:
      - Base Utility data (address, industry classification)
      - Public Master Reference data (trade style, hierarchy)
      - Social/Web data (News, Feeds, Graph)
      - Transactional Data generated by enterprise processes, workflows etc.
    - This data has traits of high volume, variety, velocity etc., and the technology needed to efficiently integrate it for your needs includes:
      - Change management of Public Reference Data across all categories
      - Applied Big Data to extract static as well as real-time insights
      - Knowledge Diagnostics and Data Mining

    As you consider how to deploy this solution, many of our customers will be using an online “cloud” service that provides quality data and insights uniformly to all their necessary applications.
    In addition, they are requesting a service that is:

    - Agile and Easy to Use: Applications integrated with the service can obtain data on-demand, quickly and simply
    - Cost-effective: Pre-integrated into applications so customers don’t have to
    - High in Data Quality: Single point access to reference data for data quality and linkages to transactional, curated and social data
    - Supportive of Data Governance: Becomes more manageable and cost-effective since control of data privacy and compliance can be enforced in a centralized place

    Data-as-a-Service (DaaS)

    Just as the cloud has transformed and now offers a better path for how an enterprise manages its IT from their infrastructure, platform, and software (IaaS, PaaS, and SaaS), the next step is data (DaaS). Over the last 3 years, we have seen the market begin to offer a cloud-based data service and gain initial traction. On one side of the DaaS continuum, we see an “appliance” type of service that provides a single, reliable source of accurate business data plus social information about accounts, leads, contacts, etc. On the other side of the continuum we see more of an online market “exchange” approach where ISVs and Data Publishers can publish and sell premium datasets within the exchange, with the exchange providing a rich set of web interfaces to improve the ease of data integration. Why the difference? It depends on the provider’s philosophy on how fast the rate of commoditization of certain data types will occur.

    How do you decide the best approach? Our perspective, as shown in the diagram below, is that the enterprise should develop an elastic schema to support multi-domain applicability. This allows the enterprise to take the most flexible approach to harness the speed and breadth of public data to achieve value. The key tenet of the proposed approach is that an enterprise carefully federates common utility, master reference data end points, mobility considerations and content processing, so that they are pervasively available. One way you may already be familiar with this approach is in how you do Address Verification treatments for accounts, contacts etc. If you design and revise this service in such a way that it is also easily available to social analytic needs, you could extend this to launch geo-location based social use cases (marketing, sales etc.).

    Our fundamental belief is that value-added data achieved through enrichment with specialized algorithms, as well as applying business “know-how” to weight-factor KPIs based on innovative combinations across an ever-increasing variety, volume and velocity of data, will be where real value is achieved. Essentially, Data-as-a-Service becomes a single entry point for the ever-increasing richness and volume of public data, with enrichment and combined capabilities to extract and integrate the right data from the right sources with the right factoring at the right time for faster decision-making and action within your core business applications. As more data becomes available (and in many cases commoditized), this value-added data processing approach will provide you with ongoing competitive advantage.

    Let’s look at a quick example of creating a master reference relationship that could be used as an input for a variety of your already existing business applications. In phase 1, a simple master relationship is achieved between a company (e.g. General Motors) and a variety of car brands’ social insights.
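    A minimal relational sketch of that phase-1 relationship might look like the following; every table and column name here is hypothetical, invented for illustration rather than taken from Oracle's service:

    ```sql
    -- Hypothetical master reference: one company record keyed to many
    -- brand-level social-insight rows (the phase-1 relationship above).
    CREATE TABLE MasterCompany (
        CompanyId    INT           PRIMARY KEY,
        CompanyName  NVARCHAR(200) NOT NULL          -- e.g. 'General Motors'
    );

    CREATE TABLE BrandSocialInsight (
        InsightId       INT           PRIMARY KEY,
        CompanyId       INT           NOT NULL REFERENCES MasterCompany (CompanyId),
        BrandName       NVARCHAR(200) NOT NULL,      -- e.g. one of the car brands
        SentimentScore  DECIMAL(4,3),                -- enriched social metric
        CapturedOn      DATE          NOT NULL
    );

    -- The kind of sort/export a CRM analytics use case would run:
    -- average brand sentiment rolled up under the master company record.
    SELECT c.CompanyName, b.BrandName, AVG(b.SentimentScore) AS AvgSentiment
    FROM MasterCompany c
    JOIN BrandSocialInsight b ON b.CompanyId = c.CompanyId
    GROUP BY c.CompanyName, b.BrandName
    ORDER BY c.CompanyName, AvgSentiment DESC;
    ```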
    The reference data allows for easy sort, export and integration into a set of CRM use cases for analytics, sales and marketing CRM. In phase 2, you create more data relationships (e.g. competitors, contacts, other brands) to provide broader and deeper references (social profiles, social meta-data) for more use cases across CRM, HCM, SRM, etc. This is just the tip of the iceberg, as the number of master reference relationships is constrained only by your imagination and the availability of quality curated data you have to work with.

    DaaS is just now emerging onto the marketplace as the next step in cloud transformation. For some of you, this may be the first you have heard about it. Let us know if you have questions, or perspectives. In the meantime, we will continue to share insights as we can.

    Photo: Erik Araujo, stock.xchng

    Read the article

  • SQL Server Developer Tools – Codename Juneau vs. Red-Gate SQL Source Control

    - by Ajarn Mark Caldwell
    So how do the new SQL Server Developer Tools (previously code-named Juneau) stack up against SQL Source Control?  Read on to find out. At the PASS Community Summit a couple of weeks ago, it was announced that the previously code-named Juneau software would be released under the name of SQL Server Developer Tools with the release of SQL Server 2012.  This replacement for Database Projects in Visual Studio (also known in a former life as Data Dude) has some great new features.  I won’t attempt to describe them all here, but I will applaud Microsoft for making major improvements.  One of my favorite changes is the way database elements are broken down.  Previously every little thing was in its own file.  For example, indexes were each in their own file.  I always hated that.  Now, SSDT uses a pattern similar to Red-Gate’s and puts the indexes and keys into the same file as the overall table definition. Of course there are really cool features to keep your database model in sync with the actual source scripts, and the rename refactoring feature is now touted as being more than just a search and replace, but rather a “semantic-aware” search and replace.  Funny, it reminds me of SQL Prompt’s Smart Rename feature.  But I’m not writing this just to criticize Microsoft and argue that they are late to the party with this feature set.  Instead, I do see it as a viable alternative for folks who want all of their source code to be version controlled, but there are a couple of key trade-offs that you need to know about when you choose which tool set to use. First, the basics Both tool sets integrate with a wide variety of source control systems including the most popular: Subversion, GIT, Vault, and Team Foundation Server.  Both tools have integrated functionality to produce objects to upgrade your target database when you are ready (DACPACs in SSDT, integration with SQL Compare for SQL Source Control).  If you regularly live in Visual Studio or the Business Intelligence Development Studio (BIDS) then SSDT will likely be comfortable for you.  Like BIDS, SSDT is a Visual Studio Project Type that comes with SQL Server, and if you don’t already have Visual Studio installed, it will install the shell for you.  If you already have Visual Studio 2010 installed, then it will just add this as an available project type.  On the other hand, if you regularly live in SQL Server Management Studio (SSMS) then you will really enjoy the SQL Source Control integration from within SSMS.  Both tool sets store their database model in script files.  In SSDT, these are on your file system like other source files; in SQL Source Control, these are stored in the folder structure in your source control system, and you can always GET them to your file system if you want to browse them directly. For me, the key differentiating factors are 1) a single, unified check-in, and 2) migration scripts.  How you value those two features will likely make your decision for you. Unified Check-In If you do a continuous-integration (CI) style of development that triggers an automated build with unit testing on every check-in of source code, and you use Visual Studio for the rest of your development, then you will want to really consider SSDT.  Because it is just another project in Visual Studio, it can be added to your existing Solution, and you can then do a complete, or unified single check-in of all changes whether they are application or database changes.  
This is simply not possible with SQL Source Control because it is in a different development tool (SSMS instead of Visual Studio) and there is no way to do one unified check-in between the two.  You CAN do really fast back-to-back check-ins, but there is the possibility that the automated build that is triggered from the first check-in will cause your unit tests to fail and the CI tool to report that you broke the build.  Of course, the automated build that is triggered from the second check-in which contains the “other half” of your changes should pass and so the amount of time that the build was broken may be very, very short, but if that is very, very important to you, then SQL Source Control just won’t work; you’ll have to use SSDT. Refactoring and Migrations If you work on a mature system, or on a not-so-mature but also not-so-well-designed system, where you want to refactor the database schema as you go along, but you can’t have data suddenly disappearing from your target system, then you’ll probably want to go with SQL Source Control.  As I wrote previously, there are a number of changes which you can make to your database that the comparison tools (both from Microsoft and Red Gate) simply cannot handle without the possibility (or probability) of data loss.  Currently, SSDT only offers you the ability to inject PRE and POST custom deployment scripts.  There is no way to insert your own script in the middle to override the default behavior of the tool.  In version 3.0 of SQL Source Control (Early Access version now available) you have that ability to create your own custom migration script to take the place of the commands that the tool would have done, and ensure the preservation of your data.  Or, even if the default tool behavior would have worked, but you simply know a better way then you can take control and do things your way instead of theirs. You Decide In the environment I work in, our automated builds are not triggered off of check-ins, but off of the clock (currently once per night) and so there is no point at which the automated build and unit tests will be triggered without having both sides of the development effort already checked-in.  Therefore having a unified check-in, while handy, is not critical for us.  As for migration scripts, these are critically important to us.  We do a lot of new development on systems that have already been in production for years, and it is not uncommon for us to need to do a refactoring of the database.  Because of the maturity of the existing system, that often involves data migrations or other additional SQL tasks that the comparison tools just can’t detect on their own.  Therefore, the ability to create a custom migration script to override the tool’s default behavior is very important to us.  And so, you can see why we will continue to use Red Gate SQL Source Control for the foreseeable future.
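    As a concrete illustration of why that matters, here is a minimal sketch of the kind of hand-written migration script the post is describing: splitting a rename into add/copy/drop steps so the data survives, instead of letting a comparison tool drop and recreate the column. The table and column names are hypothetical, not taken from the article:

    ```sql
    -- Hypothetical migration: rename dbo.Customer.FullName to DisplayName
    -- without the drop-and-recreate (and data loss) a comparison tool
    -- might otherwise generate.
    IF COL_LENGTH('dbo.Customer', 'DisplayName') IS NULL
        ALTER TABLE dbo.Customer ADD DisplayName NVARCHAR(200) NULL;
    GO

    UPDATE dbo.Customer
       SET DisplayName = FullName      -- preserve the existing data
     WHERE DisplayName IS NULL;
    GO

    ALTER TABLE dbo.Customer DROP COLUMN FullName;
    GO
    ```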

    Read the article

  • Advice for a distracted, unhappy, recently graduated programmer? [closed]

    - by Re-Invent
    I graduated 4 months ago. I had offers from a few good places to work at. At the same time I wanted to stick to building a small software business of my own; I still have some ideas with good potential, some half-done projects frozen in my github. But due to social pressures, I chose a job. The pay is great, but I am only half-passionate about it. A small team of smart folks building a useful product, working out contracts across the world. I've started finding it extremely boring. Boring to the extent that I skip 2-3 days a week altogether, not doing work. Neither do I spend that time progressing any of my own projects. Yes, I feel stupid at the way I'm wasting time, but I don't understand exactly why it is happening. It's as if all the excitement has been drained. What can I do about it?

    Long version:

    School - I was in third standard. Only students of 6th grade had access to the computer labs. I once peeked into the lab from the little door opening. No hard-disks, MS DOS on 5 1/2 inch floppies. I asked a senior student to play some sound in BASIC. He used PLAY to compose a tune. Boy! I was so excited, I was jumping from within. Back home, I asked my brother to teach me some programming. We bought a book, "MODERN All About GW-BASIC for Schools & Colleges". The book had everything, right from printing, to taking input, file I/O, game programming, machine level support, etc. In 6th standard I wrote my first game - a wheel of fortune, rotating the wheel by manipulating the 16-color palette's definition. Got internet soon, got hooked to the QuickBasic programming community. Made some more games, "007 in Danger" and "Car Crush 2", for submission to the allbasiccode archives. I was extremely excited about all this. My interests now swayed into "hacking" (computer security). Taught myself some Perl, found it annoying, learnt PHP and a bit of SQL. Also taught myself Visual Basic one of the winters and wrote a pacman clone with DirectX. By the time I was in 10th standard, I had created some evil tools using Visual Basic, PHP and MySQL and eventually landed myself an unpaid side-job at a government facility, building evil tools for them. It was a dream come true for crackers of that time. And so was I, still very excited. Things changed soon; the last two years of school were not so great as I was balancing preps for college, work at the govt. facility and studies for school at the same time.

    College - College was the opposite of all I had wished it to be. I imagined it to be a place where I'd spend my 4 years building something awesome. It was rather an epitome of rote learning: attendance, rules, busy schedules, a ban on personal laptops, hardly any hackers surrounding you and shit like that. We had to take permission to even introduce some cultural/creative activities into our annual schedule. The labs wouldn't be open on weekends because the lab employees had to have their leaves. Yes, a horrible place for someone like me. I still managed to pull out a project with a friend over 2 months. Showed it to people high in the academia hierarchy. They were immensely impressed; we proposed to allow personal computers for students. They made up half-assed reasons and didn't agree. We felt frustrated. And so on. I still managed to teach myself new languages, do new projects of my own, do an intern at the same govt. facility, start a small business for some time, give a talk at a conference I'm passionate about, win game-dev and hacking contests at the most respected colleges, solve a good deal of programming contest problems, etc.
    At the same time I was not content with all these restrictions, the great emphasis on rote learning, and the sheer wastage of time due to college. I never felt I was overdoing it, but now I feel I burnt myself out. During my last days at college, I did an intern at a bigco. While I spent my time building prototypes for a certain LBS, the other interns around me, even a good friend, were just skipping time. I thought maybe, in a few weeks, he would put in some serious effort at the work assigned to him, but all he did was find creative ways to skip work, hide his face from the manager, engage people in talks if they tried to question his progress, etc. I tried a few times to get him on track, but it seems all he wanted was to "not work hard at all and still reap the fruits". I don't know how others take such people, but I find their vicinity very very poisonous to one's own motivation and productivity.

    On top of that, in the place where I come from, HRs don't give much value to what you have done in the past 4 years. So towards the end of our intern, we were all offered work at the bigco, but the slacker, even after not writing more than 200 lines of code, was made a much better offer. I felt enraged instantly - "Is this how the corp world treats someone who does fruitful, if not extra-ordinary, work for them for the past 6 months?". Yes, I did try to negotiate and debate. The bigcos seem blind due to departmentalization of responsibilities and many layers of management. I decided not to be in touch with any characters of that depressing play.

    Probably the busy time I had at college - ignoring friends, ignoring fun and squeezing every bit of free time for myself - is also responsible. Probably this is what has drained all my willingness to work for anyone. I find my day job boring; at the same time I wish to maintain it for financial reasons. I feel a bit burnt out, unsatisfied, and at the same time feel an urge to quit working for someone else and start finishing my frozen side-projects (which may be profitable). Though I haven't got much in savings to support myself with food, office, internet bills, etc. I still have my day job, but I don't find it very interesting, and even though the pay is higher than the slacker's, I don't find money to be a great motivator here. I keep comparing myself to my past version. I wonder how to get rid of this and reboot myself back to the way I was in school days - excited about it, tinkering, building, learning new things daily, and NOT BORED?

    Read the article

  • Building an OpenStack Cloud for Solaris Engineering, Part 1

    - by Dave Miner
    One of the signature features of the recently-released Solaris 11.2 is the OpenStack cloud computing platform. Over on the Solaris OpenStack blog the development team is publishing lots of details about our version of OpenStack Havana as well as some tips on specific features, and I highly recommend reading those to get a feel for how we've leveraged Solaris's features to build a top-notch cloud platform. In this and some subsequent posts I'm going to look at it from a different perspective, which is that of the enterprise administrator deploying an OpenStack cloud. But this won't be just a theoretical perspective: I've spent the past several months putting together a deployment of OpenStack for use by the Solaris engineering organization, and now that it's in production we'll share how we built it and what we've learned so far.

    In the Solaris engineering organization we've long had dedicated lab systems dispersed among our various sites and a home-grown reservation tool for developers to reserve those systems; various teams also have private systems for specific testing purposes. But as a developer, it can still be difficult to find systems you need, especially since most Solaris changes require testing on both SPARC and x86 systems before they can be integrated. We've added virtual resources over the years as well in the form of LDOMs and zones (both traditional non-global zones and the new kernel zones). Fundamentally, though, these were all still deployed in the same model: our overworked lab administrators set up pre-configured resources and we then reserve them. Sounds like pretty much every traditional IT shop, right? Which means that there's a lot of opportunity for efficiencies from greater use of virtualization and the self-service style of cloud computing. As we were well into development of OpenStack on Solaris, I was recruited to figure out how we could deploy it to provide both more (and more efficient) development and test resources for the organization as well as a test environment for Solaris OpenStack.

    At this point, let's acknowledge one fact: deploying OpenStack is hard. It's a very complex piece of software that makes use of sophisticated networking features and runs as a ton of service daemons with myriad configuration files. The web UI, Horizon, doesn't often do a good job of providing detailed errors. Even the command-line clients are not as transparent as you'd like, though at least you can turn on verbose and debug messaging and often get some clues as to what to look for, though it helps if you're good at reading JSON structure dumps. I'd already learned all of this in doing a single-system Grizzly-on-Linux deployment for the development team to reference when they were getting started, so I at least came to this job with some appreciation for what I was taking on. The good news is that both we and the community have done a lot to make deployment much easier in the last year; probably the easiest approach is to download the OpenStack Unified Archive from OTN to get your hands on a single-system demonstration environment. I highly recommend getting started with something like it to get some understanding of OpenStack before you embark on a more complex deployment. For some situations, it may in fact be all you ever need. If so, you don't need to read the rest of this series of posts!

    In the Solaris engineering case, we need a lot more horsepower than a single-system cloud can provide.
    We need to support both SPARC and x86 VM's, and we have hundreds of developers, so we want to be able to scale to support thousands of VM's, though we're going to build to that scale over time, not immediately. We also want to be able to test both Solaris 11 updates and a release such as Solaris 12 that's under development so that we can work out any upgrade issues before release. One thing we don't have is a requirement for extremely high availability, at least at this point. We surely don't want a lot of down time, but we can tolerate scheduled outages and brief (as in an hour or so) unscheduled ones. Thus I didn't need to spend effort on trying to get high availability everywhere.

    The diagram below shows our initial deployment design. We're using six systems, most of which are x86 because we had more of those immediately available. All of those systems reside on a management VLAN and are connected with a two-way link aggregation of 1 Gb links (we don't yet have 10 Gb switching infrastructure in place, but we'll get there). A separate VLAN provides "public" (as in connected to the rest of Oracle's internal network) addresses, while we use VxLANs for the tenant networks.

    One system is more or less the control node, providing the MySQL database, RabbitMQ, Keystone, and the Nova API and scheduler as well as the Horizon console. We're curious how this will perform and I anticipate eventually splitting at least the database off to another node to help simplify upgrades, but at our present scale this works.

    I had a couple of systems with lots of disk space, one of which was already configured as the Automated Installation server for the lab, so it's just providing the Glance image repository for OpenStack. The other node with lots of disks provides the Cinder block storage service; we also have a ZFS Storage Appliance that will help back-end Cinder in the near future, I just haven't had time to get it configured in yet.

    There's a separate system for Neutron, which is our Elastic Virtual Switch controller and handles the routing and NAT for the guests. We don't have any need for firewalling in this deployment so we're not doing so. We presently have only two tenants defined: one for the Solaris organization that's funding this cloud, and a separate tenant for other Oracle organizations that would like to try out OpenStack on Solaris. Each tenant has one VxLAN defined initially, but we can of course add more. Right now we have just a single /24 network for the floating IP's; once we get demand up to where we need more, then we'll add them.

    Finally, we have started with just two compute nodes; one is an x86 system, the other is an LDOM on a SPARC T5-2. We'll be adding more when demand reaches the level where we need them, but as we're still ramping up the user base it's less work to manage fewer nodes until then.

    My next post will delve into the details of building this OpenStack cloud's infrastructure, including how we're using various Solaris features such as Automated Installation, IPS packaging, SMF, and Puppet to deploy and manage the nodes. After that we'll get into the specifics of configuring and running OpenStack itself.

    Read the article

  • A Bite With No Teeth – Demystifying Non-Compete Clauses

    - by D'Arcy Lussier
    *DISCLAIMER: I am not a lawyer and this post in no way should be considered legal advice. I’m also in Canada, so references made are to Canadian court cases.

    I received a signed letter the other day, a reminder from my previous employer about some clauses associated with my employment and entry into an employee stock purchase program. So since this is in effect for the next 12 months, I guess I’m not starting that new job tomorrow. I’m kidding of course. How outrageous, how presumptuous, pompous, and arrogant that a company – any company – would actually place these conditions upon an employee. And yet, this is not uncommon. Especially in the IT industry, we see time and again similar wording in our employment agreements. But… are these legal? Is there any teeth behind the threat of the bite? Luckily, the answer seems to be ‘No’. I want to highlight two cases that support this.

    The first is Lyons v. Multari. In a nutshell, Dentist hires younger Dentist to be an associate. In their short, handwritten agreement, a non-compete clause was written stating “Protective Covenant. 3 yrs. – 5mi” (meaning you can’t set up shop within 5 miles for 3 years). Well, the young dentist left and did start an oral surgery office within 5 miles and within 3 years. Off to court they go! The initial judge sided with the older dentist, but on appeal it was overturned. Feel free to read the transcript of the decision here, but let me highlight one portion from section [19]:

    The general rule in most common law jurisdictions is that non-competition clauses in employment contracts are void.

    The sections following [19] explain further, and discuss Elsley v. J.G. Collins Insurance Agency Ltd. and its impact on Canadian law in this regard.

    The second case is Winnipeg Livestock Sales Ltd. v. Plewman. Desmond Plewman is an auctioneer, and worked at Winnipeg Livestock Sales. Part of his employment agreement was that he could not work for a competitor for 18 months if he left the company. Well, he left, and took up an important role in a competing company. The case went to court and, as with Lyons v. Multari, the initial judge found in favour of the plaintiffs. Also as in the first case, that was overturned on appeal. Again, read through the transcript of the decision, but consider section [28]:

    In other words, even though Plewman has a great deal of skill as an auctioneer, Winnipeg Livestock has no proprietary interest in his professional skill and experience, even if they were acquired during his time working for Winnipeg Livestock. Thus, Winnipeg Livestock has the burden of establishing that it has a legitimate proprietary interest requiring protection. On this key question there is little evidence before the Court. The record discloses that part of Plewman’s job was to “mingle with the … crowd” and to telephone customers and prospective customers about future prospects for the sale of livestock. It may seem reasonable to assume that Winnipeg Livestock has a legitimate proprietary interest in its customer connections; but there is no evidence to indicate that there is any significant degree of “customer loyalty” in the business, as opposed to customers making choices based on other considerations such as cost, availability and the like.

    So are there any incidents where a non-compete can actually be valid? Yes, and these are considered “exceptional” cases, meaning that the situation meets certain circumstances.
    Michael Carabash has a great blog series discussing the above mentioned cases as well as the difference between a non-compete and non-solicit agreement. He talks about the exceptional criteria: In summary, the authorities reveal that the following circumstances will generally be relevant in determining whether a case is an “exceptional” one so that a general non-competition clause will be found to be reasonable:

    - The length of service with the employer.
    - The amount of personal service to clients.
    - Whether the employee dealt with clients exclusively, or on a sustained or recurring basis.
    - Whether the knowledge about the client which the employee gained was of a confidential nature, or involved an intimate knowledge of the client’s particular needs, preferences or idiosyncrasies.
    - Whether the nature of the employee’s work meant that the employee had influence over clients in the sense that the clients relied upon the employee’s advice, or trusted the employee.
    - If competition by the employee has already occurred, whether there is evidence that clients have switched their custom to him, especially without direct solicitation.
    - The nature of the business with respect to whether personal knowledge of the clients’ confidential matters is required.
    - The nature of the business with respect to the strength of customer loyalty, how clients are “won” and kept, and whether the clientele is a recurring one.
    - The community involved and whether there were clientele yet to be exploited by anyone.

    I close this blog post with a final quote, one from Zvulony & Co’s blog post on this subject. Again, all of this is not official legal advice, but I think we can see what all these sources are pointing towards. To answer my earlier question, there’s no teeth behind the threat of the bite.

    In light of this list, and the decisions in Lyons and Orlan, it is reasonably certain that in most employment situations a non-competition clause will be ineffective in protecting an employer from a departing employee who wishes to compete in the same business. The Courts have been relatively consistent in their position that if a non-solicitation clause can protect an employer’s interests, then a non-competition clause is probably unreasonable. Employers (or their solicitors) should avoid the inclination to draft restrictive covenants in broad, catch-all language.

    Or in other words, when drafting a restrictive covenant – take only what you need!

    D

    Read the article

  • PASS Summit 2010 BI Workshop Feedbacks

    - by Davide Mauri
    As many other speakers already did, I’d like to share with the SQL Community the feedback of my PASS Summit 2010 Workshop. For those who were not there, my workshop was the “BI From A-Z” and the main objective of that workshop was to introduce people to the BI world not only from a technical point of view but also insisting a lot on the methodological and “engineered” approach.

    The will to put more engineering in IT (and especially in the BI field) is something that has been growing stronger and stronger in me every day over the last 5 years, since I simply envy the fact that Airbus, Fincantieri, BMW (just to name a few) can create very complex machines “just” by putting people together and giving them some rules to follow (of course this is an oversimplification, but I think you get what I mean). The key point of engineering is that, after having defined the project blueprint, you have the possibility to give to a huge number of people the rules to follow, the correct tools in order to implement the rules easily and semi-automatically, and a way to measure the quality of the results. Could this be done in IT? Very big question, so my scope is now limited to BI.

    So that’s the main point of my workshop: an entry-level approach to BI (level was 200) in order to allow attendees to know the basics, to understand what tools they should use for which purpose and, above all, a set of rules and tools in order to make a BI solution scalable in terms of people working on it, while still maintaining a very good quality. All done not focusing only on the practice but explaining the theory behind it to see how it can help *a lot* to build a correct solution despite the technology used to implement it. The idea is to reach a point where more than 70% of the work done to create a BI solution can be reused even if technologies change. This is a very demanding challenge nowadays with the coming of Denali and its column-aligned storage and the shiny-new DAX language.

    As you may understand, I was looking forward to getting the feedback, since you may have noticed that there’s a lot of “architectural” stuff in IT but really nothing on “engineering”. So how the session would be perceived by the attendees was really unknown to me. The feedback could also give a good indication of whether the need for more “engineering” is something I feel only by myself or if it is something more broad. I’m very happy to be able to say that the overall score of 4.75 put my workshop in the TOP 20 sessions (of nearly 200 sessions)! Here are the detailed evaluations (shown as answer score: number of responses):

    - How would you rate the usefulness of the information presented in your day-to-day environment? 4.75 (3: 1, 4: 12, 5: 42)
    - How would you rate the Speaker’s presentation skills? 4.80 (3: 1, 4: 9, 5: 45)
    - How would you rate the Speaker’s knowledge of the subject? 4.95 (4: 3, 5: 52)
    - How would you rate the accuracy of the session title, description and experience level to the actual session? 4.75 (3: 2, 4: 10, 5: 43)
    - How would you rate the amount of time allocated to cover the topic/session? 4.44 (3: 7, 4: 17, 5: 31)
    - How would you rate the quality of the presentation materials? 4.62 (4: 21, 5: 34)

    The comments were all very positive.
    Many of them asked for more time on the subject (or to shorten the very last topics). I’ll treasure these comments and will review the content accordingly. We’ll organize a two-day class on this topic, where more examples will be shown and some arguments will be explained more deeply.

    I’d just like to answer a comment that asks how much of what I showed is “universally applicable”. I can tell you that all of our BI projects follow these rules and they’ve been applied to different markets (Insurance, Fashion, GDO) with different people and different teams, and they allowed us to be “Adaptive” towards the customer. The more the rules are well defined, and the more there are tools that support their implementation, the easier it is to add new people to the project and to add or change solution features. Think of a car. How come almost any mechanic can help you to fix a problem? Because they know what to expect. Because there are rules that allow them to identify the problem without having to discover each time how the car has been built. And this is of course also true for car upgrades/improvements.

    Last but not least: thanks a lot to everyone for coming!

    Read the article

  • Finding nuggets in ARC discussions

    - by alanc
    A bit over twenty years ago, Sun formed an Architecture Review Committee (ARC) that evaluates proposals to change interfaces between components in Sun software products. During the OpenSolaris days, we opened many of these discussions to the community. While they’re back behind closed doors, and at a different company now, we still continue to hold these reviews for the software from what’s now the Sun Systems Group division of Oracle.

    Recently one of these reviews was held (via e-mail discussion) to review a proposal to update our GNU findutils package to the latest upstream release. One of the upstream changes discussed was the addition of an “oldfind” program. In findutils 4.3, find was modified to use the fts() function to walk the directory tree, and oldfind was created to provide the old mechanism in case there were bugs in the new implementation that users needed to work around.

    In Solaris 11 though, we still ship the find descended from SVR4 as /usr/bin/find and the GNU find is available as either /usr/bin/gfind or /usr/gnu/bin/find. This raised the discussion of whether we should add oldfind, and if so what we should call it. Normally our policy is to only add the g* names for GNU commands that conflict with an existing Solaris command – for instance, we ship /usr/bin/emacs, not /usr/bin/gemacs. In this case however, it seemed like it would be more confusing to have /usr/bin/oldfind be the older version of /usr/bin/gfind, not of /usr/bin/find. Thus if we shipped it, it would make more sense to call it /usr/bin/goldfind, which several ARC members noted read more naturally as “gold find” than as “g old find”.

    One of the concerns we often discuss in ARC is whether a change is likely to be understood by users or if it will result in more calls to support. As we hit this part of the discussion on a Friday at the end of a long week, I couldn’t resist putting forth a hypothetical support call for this command:

    “Hello, Oracle Solaris Support, how may I help you?”
    “My admin is out sick, but he sent an email that he put the findutils package on our server, and I can run goldfind now. I tried it, but goldfind didn’t find gold.”
    “Did he get the binutils package too?”
    “No he just said findutils, do we need binutils?”
    “Well, gold comes in the binutils package, so goldfind would be able to find gold if you got that package.”
    “How much does Oracle charge for that package?”
    “It’s free for Solaris users.”
    “You mean Oracle ships packages of gold to customers for free?”
    “Yes, if you get the binutils package, it includes GNU gold.”
    “New gold? Is that some sort of alchemy, turning stuff into gold?”
    “Not new gold, gold from the GNU project.”
    “Oracle’s taking gold from the GNU project and shipping it to me?”
    “Yes, if you get binutils, that package includes gold along with the other tools from the GNU project.”
    “And GNU doesn’t mind Oracle taking their gold and giving it to customers?”
    “No, GNU is a non-profit whose goal is to share their software.”
    “Sharing software sure, but gold? Where does a non-profit like GNU get gold anyway?”
    “Oh, Google donated it to them.”
    “Ah! So Oracle will give me the gold that GNU got from Google!”
    “Yes, if you get the package from us.”
    “How do I get the package with the gold?”
    “Just run pkg install binutils and it will put it on your disk.”
    “We’ve got multiple disks here - which one will it put it on?”
    “The one with the system image - do you know which one that is?”
“Well the note from the admin says the system is on the first disk and the users are on the second disk.” “Okay, so it should go on the first disk then.” “And where will I find the gold?” “It will be in the /usr/bin directory.” “In the user’s bin? So that’s on the second disk?” “No, it would be on the system disk, with the other development tools, like make, as, and what.” “So what’s on the first disk?” “Well if the system image is there the commands should all be there.” “All the commands? Not just what?” “Right, all the commands that come with the OS, like the shell, ps, and who.” “So who’s on the first disk too?” “Yes. Did your admin say when he’d be back?” “No, just that he had a massive headache and was going home after I tried to get him to explain this stuff to me.” “I can’t imagine why.” “Oh, is why a command too?” “No, _why was a Ruby programmer.” “Ruby? Do you give those away with the gold too?” “Yes, but it comes in the ruby package, not binutils.” “Oh, I’ll have to have my admin get that package too! Thanks!” Needless to say, we decided this might not be the best idea. Since the GNU project hasn’t had to release a serious bug fix for the new find in the past few years, the new GNU find seems pretty stable, and we always have the SVR4 find to use as a fallback in Solaris, so adding oldfind didn’t seem really necessary, and we passed on including it when we update to the new findutils release. [Apologies to Abbott, Costello, their fans, and everyone who read this far. The Gold (linker) page on Wikipedia may explain some of the above, but can’t explain why goldfind is the old GNU find, but gold is the new GNU ld.]
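    For readers who want to poke at this on their own machine, here is a minimal shell sketch of the naming scheme discussed above (assuming a Solaris 11 system with IPS; only pkg install binutils is taken from the post itself, the rest is illustrative and worth verifying on your own system):

        # Install GNU findutils and binutils via IPS (the Solaris 11 package system).
        pkg install findutils binutils

        # The SVR4 find and the GNU find live side by side:
        /usr/bin/find --version     # SVR4 find: will reject this GNU-style flag
        /usr/bin/gfind --version    # GNU find, also available as /usr/gnu/bin/find

        # And the "gold" of the dialogue is the GNU linker from binutils
        # (path per the dialogue above; check where your system installs it):
        /usr/bin/gold --version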

    Read the article

  • CodePlex Daily Summary for Wednesday, September 05, 2012

    CodePlex Daily Summary for Wednesday, September 05, 2012Popular ReleasesDesktop Google Reader: 1.4.6: Sorting feeds alphabetical is now optional (see preferences window)DotNetNuke® Community Edition CMS: 06.02.03: Major Highlights Fixed issue where mailto: links were not working when sending bulk email Fixed issue where uses did not see friendship relationships Problem is in 6.2, which does not show in the Versions Affected list above. Fixed the issue with cascade deletes in comments in CoreMessaging_Notification Fixed UI issue when using a date fields as a required profile property during user registration Fixed error when running the product in debug mode Fixed visibility issue when...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.65: Fixed null-reference error in the build task constructor.BLACK ORANGE: HPAD TEXT EDITOR 0.9 Beta: HOW TO RUN THE TEXT EDITOR Download the HPAD ARCHIVED FILES which is in .rar format Extract using Winrar Make sure that extracted files are in the same folder Double-Click on HPAD.exe application fileTelerikMvcGridCustomBindingHelper: Version 1.0.15.247-RC2: TelerikMvcGridCustomBindingHelper 1.0.15.247 RC2 Release notes: This is a RC version (hopefully the last one), please test and report any error or problem you encounter. This release is all about performance and fixes Support: "Or" and "Does Not contain" filter options Improved BooleanSubstitutes, Custom Aggregates and expressions-to-queryover Add EntityFramework examples in ExampleWebApplication Many other improvements and fixes Fix invalid cast on CustomAggregates Support for ...ServiceMon - Extensible Real-time, Service Monitoring Utility: ServiceMon Release 0.9.0.44: Auto-uploaded from build serverJavaScript Grid: Release 09-05-2012: Release 09-05-2012xUnit.net Contrib: xunitcontrib-dotCover 0.6.1 (dotCover 2.1 beta): xunitcontrib release 0.6.1 for dotCover 2.1 beta This release provides a test runner plugin for dotCover 2.1 beta, targetting all versions of xUnit.net. (See the xUnit.net project to download xUnit.net itself.) This release adds support for running xUnit.net tests to dotCover 2.1 beta's Visual Studio plugin. PLEASE NOTE: You do NOT need this if you also have ReSharper and the existing 0.6.1 release installed. DotCover will use ReSharper's test runners if available. This release includes th...B INI Sharp Library: B INI Sharp Library v1.0.0.0 Realsed: The frist realsedActive Forums for DotNetNuke CMS: Active Forums 5.0.0 RC: RC release of Active Forums 5.0.Droid Explorer: Droid Explorer 0.8.8.7 Beta: Bug in the display icon for apk's, will fix with next release Added fallback icon if unable to get the image/icon from the Cloud Service Removed some stale plugins that were either out dated or incomplete. Added handler for *.ab files for restoring backups Added plugin to create device backups Backups stored in %USERPROFILE%\Android Backups\%DEVICE_ID%\ Added custom folder icon for the android backups directory better error handling for installing an apk bug fixes for the Runn...BI System Monitor: v2.1: Data Audits report and supporting SQL, and SSIS package Environment Overview report enhancements, improving the appearance, addition of data audit finding indicators Note: SQL 2012 version coming soon.The Visual Guide for Building Team Foundation Server 2012 Environments: Version 1: --Nearforums - ASP.NET MVC forum engine: Nearforums v8.5: Version 8.5 of Nearforums, the ASP.NET MVC Forum Engine. 
New features include: Built-in search engine using Lucene.NET Flood control improvements Notifications improvements: sync option and mail body View Roadmap for more details webdeploy package sha1 checksum: 961aff884a9187b6e8a86d68913cdd31f8deaf83WiX Toolset: WiX Toolset v3.6: WiX Toolset v3.6 introduces the Burn bootstrapper/chaining engine and support for Visual Studio 2012 and .NET Framework 4.5. Other minor functionality includes: WixDependencyExtension supports dependency checking among MSI packages. WixFirewallExtension supports more features of Windows Firewall. WixTagExtension supports Software Id Tagging. WixUtilExtension now supports recursive directory deletion. Melt simplifies pure-WiX patching by extracting .msi package content and updating .w...Iveely Search Engine: Iveely Search Engine (0.2.0): ????ISE?0.1.0??,?????,ISE?0.2.0?????????,???????,????????20???follow?ISE,????,??ISE??????????,??????????,?????????,?????????0.2.0??????,??????????。 Iveely Search Engine ?0.2.0?????????“??????????”,??????,?????????,???????,???????????????????,????、????????????。???0.1.0????????????: 1. ??“????” ??。??????????,?????????,???????????????????。??:????????,????????????,??????????????????。??????。 2. ??“????”??。?0.1.0??????,???????,???????????????,?????????????,????????,?0.2.0?,???????...GmailDefaultMaker: GmailDefaultMaker 3.0.0.2: Add QQ Mail BugfixSmart Data Access layer: Smart Data access Layer Ver 3: In this version support executing inline query is added. Check Documentation section for detail.DotNetNuke® Form and List: 06.00.04: DotNetNuke Form and List 06.00.04 Don't forget to backup your installation before upgrade. Changes in 06.00.04 Fix: Sql Scripts for 6.003 missed object qualifiers within stored procedures Fix: added missing resource "cmdCancel.Text" in form.ascx.resx Changes in 06.00.03 Fix: MakeThumbnail was broken if the application pool was configured to .Net 4 Change: Data is now stored in nvarchar(max) instead of ntext Changes in 06.00.02 The scripts are now compatible with SQL Azure, tested in a ne...Coevery - Free CRM: Coevery 1.0.0.24: Add a sample database, and installation instructions.New ProjectsA Simple Eng-Hindi CMS: A simple English- Hindi dual language content management system for small business/personal websites.Active Social Migrator: This project for managing the Active Social migration tool.ANSI Console User Control: Custom console control for .NET WinformsAutoSPInstallerGUI: GUI Configuration Tool for SPAutoInstaller Codeplex ProjectCode Documentation Checkin Policy: This checkin policy for Visual Studio 2012 checks if c# code is documented the way it's configured in the config of the policy. Code Dojo/Kata - Free Time Coding: Doing some katas of the Coding Dojo page. http://codingdojo.org/cgi-bin/wiki.pl?KataCataloguefjycUnifyShow: fjycUnifyShowHidden Capture (HC): HC is simple and easy utility to hidden and auto capture desktop or active windowHRC Integration Services: Fake SQL Server Integration Services. LOLKooboo CMS Sites Switcher: Kooboo CMS Sites SwitcherMod.CookieDetector: Orchard module for detecting whether cookies are enabledMyCodes: Created!MySQL Statement Monitor: MySQL Statement Monitor is a monitoring tool that monitors SQL statements transferred over the network.NeoModulusPIRandom: The idea with PI Random is to use easy string manipulation and simple math to generate a pseudo random number. 
Net Core Tech - Medical Record System: This is a Medical Record System ProjectOraPowerShell: PowerShell library for backup and maintenance of a Oracle Database environment under Microsoft Windows 2008PinDNN: PinDNN is a module that imparts Pinterest-like functionality to DotNetNuke sites. This module works with a MongoDB database and uses the built-in social relatioPyrogen Code Generator: PyroGen is a simple code generator accepting C# as the markup language.restMs: wil be deleted soonScript.NET: Script.NET is a script management utility for web forms and MVC, using ScriptJS-like features to link dependencies between scripts.SpringExample-Pagination: Simple Spring example with PaginationXNA and Component Based Design: This project includes code for XNA and Component Based Design

    Read the article

  • Java EE 7 Survey Results!

    - by reza_rahman
On November 8th, the Java EE EG posted a survey to gather broad community feedback on a number of critical open issues. For reference, you can find the original survey here. We kept the survey open for about three weeks until November 30th. To our delight, over 1100 developers took time out of their busy lives to let their voices be heard! The results of the survey were sent to the EG on December 12th. The subsequent EG discussion is available here. The exact summary sent to the EG is available here. We would like to take this opportunity to thank each and every one of the individuals who took the survey. It is very much appreciated, encouraging and worth its weight in gold. In particular, I tried to capture just some of the high-quality, intelligent, thoughtful and professional comments in the summary to the EG. I highly encourage you to continue to stay involved, perhaps through the Adopt-a-JSR program. We would also like to sincerely thank java.net, JavaLobby, TSS and InfoQ for helping spread the word about the survey. Below is a brief summary of the results... APIs to Add to Java EE 7 Full/Web Profile The first question asked which of the four new candidate APIs (WebSocket, JSON-P, JBatch and JCache) should be added to the Java EE 7 Full and Web profile respectively. As the following graph shows, there was significant support for adding all the new APIs to the full profile: Support is relatively weakest for Batch 1.0, but still good. A lot of folks saw WebSocket 1.0 as a critical technology, with comments such as this one: "A modern web application needs Web Sockets as first class citizens" While it is clearly seen as being important, a number of commenters expressed dissatisfaction with the lack of a higher-level JSON data binding API, as illustrated by this comment: "How come we don't have a Data Binding API for JSON" JCache was also seen as being very important, as expressed with comments like: "JCache should really be that foundational technology on which other specs have no fear to depend on" The results for the Web Profile are not surprising. While there is strong support for adding WebSocket 1.0 and JSON-P 1.0 to the Web Profile, support for adding JCache 1.0 and Batch 1.0 is relatively weak. There was actually significant opposition to adding Batch 1.0 (with 51.8% casting a 'No' vote). Enabling CDI by Default The second question asked was whether CDI should be enabled in Java EE environments by default. A significant majority of developers (73.3%) supported enabling CDI; only 13.8% opposed. Comments such as these two reflect strong general support for CDI as well as a desire for better Java EE alignment with CDI: "CDI makes Java EE quite valuable!" "Would prefer to unify EJB, CDI and JSF lifecycles" There is, however, a palpable concern around the performance impact of enabling CDI by default, as exemplified by this comment: "Java EE projects in most cases use CDI, hence it is sensible to enable CDI by default when creating a Java EE application.
However, there are several issues if CDI is enabled by default: scanning can be slow - not all libs use CDI (hence, scanning is not needed)" Another significant concern appears to be around backwards compatibility and conflict with other JSR 330 implementations like Spring: "I am leaning towards yes, however can easily imagine situations where errors would be caused by automatically activating CDI, especially in cases of backward compatibility where another DI engine (such as Spring and the like) happens to use the same mechanics to inject dependencies and in that case there would be an overlap in injections and probably an uncertain outcome" Some commenters, such as this one, attempt to suggest solutions to these potential issues: "If you have Spring in use and use javax.inject.Inject then you might get some unexpected behavior that could be equally confusing. I guess there will be a way to switch CDI off. I'm tempted to say yes but am cautious for this reason" Consistent Usage of @Inject The third question was around using CDI/JSR 330 @Inject consistently vs. allowing JSRs to create their own injection annotations. A slight majority of developers (53.3%) supported using @Inject consistently across JSRs. 28.8% said using custom injection annotations is OK, while 18.0% were not sure. The vast majority of commenters were strongly supportive of CDI and general Java EE alignment with CDI, as illustrated by these comments: "Dependency Injection should be standard from now on in EE. It should use CDI as that is the DI mechanism in EE and is quite powerful. Having a new JSR specific DI mechanism to deal with just means more reflection, more proxies. JSRs should also be constructed to allow some of their objects Injectable. @Inject @TransactionalCache or @Inject @JMXBean etc...they should define the annotations and stereotypes to make their code less procedural. Dog food it. If there is a shortcoming in CDI for a JSR fix it and we will all be grateful" "We're trying to make this a comprehensive platform, right? Injection should be a fundamental part of the platform; everything else should build on the same common infrastructure. Each-having-their-own is just a recipe for chaos and having to learn the same thing 10 different ways" Expanding the Use of @Stereotype The fourth question was about expanding CDI @Stereotype to cover annotations across Java EE beyond just CDI. A significant majority of developers (62.3%) supported expanding the use of @Stereotype; only 13.3% opposed. A majority of commenters supported the idea, as well as the theme of general CDI/Java EE alignment, as expressed in these examples: "Just like defining new types for (compositions of) existing classes, stereotypes can help make software development easier" "This is especially important if many EJB services are decoupled from the EJB component model and can be applied via individual annotations to Java EE components. @Stateless is a nicely compact annotation. Code will not improve if that will have to be applied in the future as @Transactional, @Pooled, @Secured, @Singlethreaded, @...." Some, however, expressed concerns about increased complexity, such as this commenter: "Could be very convenient, but I'm afraid if it wouldn't make some important class annotations less visible" Expanding Interceptor Use The final set of questions was about expanding interceptors further across Java EE... A very solid 96.3% of developers wanted to expand interceptor use to all Java EE components.
35.7% even wanted to expand interceptors to other Java EE managed classes. Most developers (54.9%) were not sure whether there is any place where injection is supported that should not support interceptors. 32.8% thought any place that supports injection should also support interceptors. Only 12.2% were certain that there are places where injection should be supported but not interceptors. The comments reflected the diversity of opinions, but were generally supportive of interceptors: "I think interceptors are as fundamental as injection and should be available anywhere in the platform" "The whole usage of interceptors still needs to take hold in Java programming, but it is a powerful technology that needs some time in the Sun. Basically it should become part of Java SE, maybe the next step after lambdas?" A distinct chain of thought separated interceptors from filters and listeners: "I think that the Servlet API already provides a rich set of possibilities to hook yourself into different Servlet container events. I don't find a need to 'pollute' the Servlet model with the Interceptors API"

    Read the article

  • ??Recovery Manager (RMAN)??

    - by ??
????,RMAN?Oracle?????????????????????,?RMAN?????????????????,?????????????? ?????,RMAN????? RMAN???????????? RMAN???/?????????? ???????????? ????1487262.1???????????,?????RMAN??/????????4??????: ???? ???? ???? ??????? ???????????????,???????SQLPLUS???,???????????“DD-MON-RR HH24:MI:SS”???????,???????????
??:
SQL>spool monitor.out
SQL>@monitor '06-aug-12 16:38:03'
?????RMAN???????????,?????????,?????? - ???????????????????,?????RMAN?????? ???????4???????????????
????
??DBA????????V$SESSION_LONGOPS???????????????????????,??????????????????
??:
SID CH CONTEXT  SOFAR TOTALWORK % Complete
--- -- ------- ------ --------- ----------
 16 t1       1 181950   1186752      15.33
148 t2       1 249722    422400      59.12
??: - ???????????????? - ???????????????
????
??????RMAN?????????????????????????????????????????????????????????Oracle????????????????????????
??:
SID CH SEQ# EVENT                             STATE   SECONDS
--- -- ---- --------------------------------- ------- -------
 16 t1 5735 RMAN backup & recovery I/O        WAITING   10
143    8200 SQL*Net message from client       WAITING  258.83
148 t2 7941 Backup: MML create a backup piece WAITING  196.13
?????????T2???????????????,???????T2?????????????????
??: -?CH??????RMAN???,?Rman???????????,????????? -??SEQ#?????????,???????? -?????????,??????SEQ#???? -????????????????RMAN backup & recovery I/O'??????????????IO????
????
???????????????????????????????,???????backup_tape_io_slaves=TRUE,??????????????????
??:
SID CH STATUS      OPEN_TIME          SOFAR Mb  TOTMB IO_COUNT % Complete TYPE   FILENAME
--- -- ----------- ------------------ -------- ------ -------- ---------- ------ ------------
 19 t1 FINISHED    02-aug-12 17:28:42    567.5  567.5      569     100.00 INPUT  users01.dbf
 19 t1 IN PROGRESS 02-aug-12 17:28:42  3489.99   8704     3490      40.10 INPUT  sh.dbf
 19 t1 IN PROGRESS 02-aug-12 17:28:42     3778            15113            OUTPUT kvnhlb22_1_1
 20 t2 FINISHED    02-aug-12 17:28:38      740    740       742     100.00 INPUT  system01.dbf
 20 t2 FINISHED    02-aug-12 17:28:38      880    880       882     100.00 INPUT  sysaux01.dbf
 20 t2 FINISHED    02-aug-12 17:28:38     1680   1680      1682     100.00 INPUT  undotbs01.dbf
 20 t2 FINISHED    02-aug-12 17:28:38  3007.25            12029            OUTPUT l0nhlb23_1_1
??: - ???????????????,??RMAN??????IO,??????????IO??,??(Document 360443.1 RMAN Backup Performance) - ?? IO_COUNT???????????
???????
?????,????????, ????????????v$backup_sync_io ?????????.
??:
SID CH FILENAME     TYPE   STATUS      BSZ    BFC OPEN               IO_COUNT
--- -- ------------ ------ ----------- ------ --- ------------------ --------
 16 t1 ksnhla1b_1_1 OUTPUT IN PROGRESS 262144   4 02-aug-12 17:12:20     1092
148 t2 ktnhla1c_1_1 OUTPUT IN PROGRESS 262144   4 02-aug-12 17:12:15     2968
??: -?????????????????,??RMAN??????IO -?? IO_COUNT?????????? -??IO_COUNT ????, ???IO_COUNT ????????????,???????????????,?????????????
????: ???????????????
Document 1487262.1 Script to monitor RMAN Backup and Restore Operations
Document 144640.1  RMAN: Monitoring Recovery Manager Jobs
Document 360443.1  RMAN Backup Performance
Document 740911.1  RMAN Restore Performance
??,?????My Oracle Support Database Backup and Recovery community ?Oracle?????????????/??????
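    The readable parts of this post boil down to running the monitoring script from MOS note 1487262.1 against an active RMAN job. A minimal shell sketch of that workflow (the monitor.sql file name and the ad-hoc query are assumptions for illustration; the real script ships with the MOS note):

        # Run the MOS 1487262.1 monitoring script while an RMAN backup/restore
        # is running; the timestamp argument matches the usage shown above.
        sqlplus -S / as sysdba @monitor.sql '06-aug-12 16:38:03'

        # Or check progress directly from v$session_longops (ad-hoc illustration):
        sqlplus -S / as sysdba <<'EOF'
        SELECT sid, sofar, totalwork,
               ROUND(sofar / totalwork * 100, 2) AS pct_complete
        FROM   v$session_longops
        WHERE  opname LIKE 'RMAN%' AND totalwork > 0;
        EOF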

    Read the article

  • ?RAC????????????

    - by Allen Gao
????????????????????????????????????????,??????????????Oracle RAC?????????????????????????????,???????????????????,??????RAC???????????,????????????????????????????????????,????3???RAC?????????????
????????MOS ??"Top 11 Things to do NOW to Stabilize your RAC Cluster Environment”(DOC ID 1344678.1)???,???,??????3???????????,?????????????????????????????,???,?????????????????,??,??????????????,??????????????????????,?????????????,??????????????????????,???????RAC DBA????
??????? (PSU)??,?????????PSU?
???????????,???????Oracle???????????(PSU)???PSU?????????????????,??PSU???????????????????PSU????????,???????????????PSU,????????6????????????????????BUG????,??????????,?????????????????????,???9???,???RAC???(Cluster)BUG,??7%??BUG??????,??????????BUG??????????????????????PSU??????RAC???,PSU????????: PSU?????Grid Infrastructure(GI)home,???????????RDBMS home???????,??GI home????PSU,?????home?????,??????????GI????????????,??????,??RDBMS PSU,GI PSU??????????GI home??????PSU,???????RDBMS??PSU? RAC????PSU????rolling????? –?????????GI? RDBMS?????????????????,??PSU???????,???????????????? ???????????PSU,????????????,?????????PSU????,???RAC?????????????PSU???,????????????????????
??PSU?????, ??????MOS??:
NOTE 854428.1   Intro to Patch Set Updates (PSU)
NOTE 1082394.1 11.2.0.X Grid Infrastructure PSU Known Issues
NOTE 756671.1   Oracle Recommended Patches -- Oracle Database
NOTE 161549.1   Oracle Database, Networking and Grid Agent Patches for Microsoft Platforms
NOTE 810394.1   RAC and Oracle Clusterware Best Practices and Starter Kit
11gR2???????,?Diagwait???13?
?2012?,??45%????????11gR2???????,????diagwait?13????RAC???????????,????diagwait??????????????,????????????????, diagwait??RAC?????????????: ?????,??????OPROCD?????1??0.5?????,????,??OPROCD??? 1.5????,?????????diagwait????13??OPROCD??????????10?( diagwait - CSS????[???3?]),????????OPROCD???????????????'?'?????????????,1.5??????????????????OPROCD?????????????11?(1?????+10????)? ?????/???????,??diagwait,??????????????????????,??,????????????? ?11g?2?(11.2.0.1?????)??,?????????????,???????,??????????????????,????????????????,?????????????????????diagwait????????,????????????????????,????????Oracle?????(OCR),?????????OCR???????????,?????????diagwai?????????????????:
# $CLUSTERWARE_HOME\bin\crsctl get css diagwait
????DIAGWAIT???,??????MOS??:
NOTE 567730.1  Changes in Oracle Clusterware on Linux with the 10.2.0.4 Patchset
NOTE 559365.1  Using Diagwait as a diagnostic to get more information for diagnosing Oracle Clusterware Node evictions
NOTE 810394.1 RAC and Oracle Clusterware Best Practices and Starter Kit
??OS Watcher Black Box(OSWbb) ? Cluster Health Monitor(CHM)
????????OS??????????????,??,??????OS Watcher Black Box(OSWbb)(??OS Watcher)?Cluster Health Monitor(CHM)????????OS???,??DBA????????????????????????????,?????????????,??????????,?????????????????????????OS????????,????????????,???????????????????? OSWbb?????????,??????,????OS??????????????,????OS??????OSWbb???????: ?????,??30??????????OS?????????????(??5??)????????????????????,?1???5????????????????????????30????????,Oracle???????????????OS?????????????,Oracle??????OSWbb?20???????? OSWbb?????????????????Oracle???????????????????OS????,??,?????????????????????????Oracle???????,?????????????,????????????????? ???11.2.0.3??,??????(HP-UX??)?,Oracle GI?????????,Cluster Health Monitor (CHM)?CHM??????,?????OSW????,??,???????OSW????,?????????? Oracle??????????????????OSWbb?/?CHM,?????????,????????????????????,??????????OSWbb,???????????RAC??,??????????????????(???NOTE 580513.1“How To Start OS Watcher Black Box Every System Boot”??????)?
??OSWbb?CHM?????, ??????MOS??:
NOTE 301137.1   OS Watcher Black Box User Guide
NOTE 1328466.1 Cluster Health Monitor (CHM) FAQ
NOTE 810394.1   RAC and Oracle Clusterware Best Practices and Starter Kit
??
?????????RAC/ Oracle?????????????3???????????3?,?????RAC??????,?????????????????,?????MOS??:
NOTE 1344678.1 Top 11 Things to do NOW to Stabilize your RAC Cluster Environment
????,???MOS-RAC/Scalability community??,?Oracle???????????,????RAC/ Oracle?????
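    A quick hedged shell companion to the recommendations above (the clusterware path and the OSWbb install directory are assumptions; the crsctl command is quoted in the post itself and the OSWbb arguments follow MOS note 301137.1, so verify both for your platform):

        # Check the current diagwait setting; 13 is the value the post
        # recommends for pre-11gR2 clusters.
        $CLUSTERWARE_HOME/bin/crsctl get css diagwait

        # Start OS Watcher Black Box: one snapshot every 30 seconds, keep
        # 48 hours of archives (install path /opt/oswbb is illustrative).
        cd /opt/oswbb && ./startOSWbb.sh 30 48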

    Read the article

  • Flex - Typed ArrayCollection as Horizontallist's dataprovider

    - by BS_C3
Hi community! I have an ArrayCollection of objects. I'm passing this array to a horizontallist as a dataprovider and I'm using a custom itemRenderer. When executing the application, the horizontallist is displaying [object CustomClass][object CustomClass][object CustomClass][object CustomClass] I've tried casting each object in the itemrenderer as follows: <mx:Label text="{(data as CustomClass).label1}"/> But it's not working... Thanks for any help you can provide. Regards, BS_C3 Edit - 09 March 2010 Let's go for some more code =)
<?xml version="1.0" encoding="utf-8"?>
<mx:Canvas xmlns:mx="http://www.adobe.com/2006/mxml">
    <mx:Component id="Item">
        <mx:VBox width="180">
            <mx:HBox width="100%">
                <mx:Spacer width="100%"/>
                <mx:Button label="x"/>
            </mx:HBox>
            <mx:Image id="thumbnail"/>
            <mx:Label width="100%" horizontalCenter="0" text="Collection"/>
            <mx:HBox width="100%">
                <mx:Label width="100" text="GIA"/>
                <mx:Label text="{data.charg_st}"/>
            </mx:HBox>
            <mx:HBox width="100%">
                <mx:Label width="100" text="Finger Size"/>
                <mx:Label text="xxxxxx"/>
            </mx:HBox>
            <mx:HBox width="100%">
                <mx:Label width="100" text="Carat"/>
                <mx:Label text="{data.carats}"/>
            </mx:HBox>
            <mx:HBox width="100%">
                <mx:Label width="100" text="Color"/>
                <mx:Label text="{data.color}"/>
            </mx:HBox>
            <mx:HBox width="100%">
                <mx:Label width="100" text="Clarity"/>
                <mx:Label text="{data.clarity}"/>
            </mx:HBox>
            <mx:HBox width="100%">
                <mx:Label width="100" text="Shop"/>
                <mx:Label text="{data.lgort_fp}"/>
            </mx:HBox>
            <mx:HBox width="100%">
                <mx:Label width="100" text="Resizing"/>
                <mx:Label text="{data.resizing}"/>
            </mx:HBox>
            <mx:HBox width="100%">
                <mx:Label width="100" text="Price Excl. VAT"/>
                <mx:Label text="{data.net_price_fp}"/>
            </mx:HBox>
        </mx:VBox>
    </mx:Component>
    <mx:HorizontalList dataProvider="{GlobalData.instance.tray}" columnCount="4" rowCount="1" horizontalScrollPolicy="off" itemRenderer="{Item}"/>
</mx:Canvas>
FYI, the horizontalList dataprovider is an ArrayCollection of objects. Now, the horizontallist is displaying empty items... with the correct width... The arraycollection is not empty (I'm using an alert on the click event on an item, and I do retrieve the expected data). Hope this will help _< Regards, BS_C3

    Read the article

  • JqGrid addJSONData + ASP.NET 2.0 WS

    - by MilosC
Dear community! I am a bit lost. I've tried to implement a solution based on JqGrid and tried to use a function as the datatype. I've set it all up by the book, I guess: I get the WS invoked and get JSON back, I get success on the client side in the ajax call, and I "bind" jqGrid using addJSONData, but the grid remains empty. I do not have any clue now... other "local" samples on the same pages work without a problem (jsonstring ...) My WS method looks like:
[WebMethod]
[ScriptMethod(ResponseFormat = ResponseFormat.Json)]
public string GetGridData()
{
    // Load a list
    InitSessionVariables();
    SA.DB.DenarnaEnota.DenarnaEnotaDB db = new SAOP.SA.DB.DenarnaEnota.DenarnaEnotaDB();
    DataSet ds = db.GetLookupForDenarnaEnota(SAOP.FW.DB.RecordStatus.All);
    // Turn into HTML friendly format
    GetGridData summaryList = new GetGridData();
    summaryList.page = "1";
    summaryList.total = "10";
    summaryList.records = "160";
    int i = 0;
    foreach (DataRow dr in ds.Tables[0].Rows)
    {
        GridRows row = new GridRows();
        row.id = dr["DenarnaEnotaID"].ToString();
        row.cell = "[" + "\"" + dr["DenarnaEnotaID"].ToString() + "\"" + ","
                       + "\"" + dr["Kratica"].ToString() + "\"" + ","
                       + "\"" + dr["Naziv"].ToString() + "\"" + ","
                       + "\"" + dr["Sifra"].ToString() + "\"" + "]";
        summaryList.rows.Add(row);
    }
    return JsonConvert.SerializeObject(summaryList);
}
My ASCX code is this:
jQuery(document).ready(function(){
    jQuery("#list").jqGrid({
        datatype: function (postdata) {
            jQuery.ajax({
                url: '../../AjaxWS/TemeljnicaEdit.asmx/GetGridData',
                data: '{}',
                dataType: 'json',
                type: 'POST',
                contentType: "application/json; charset=utf-8",
                complete: function(jsondata, stat){
                    if (stat == "success") {
                        var clearJson = jsondata.responseText;
                        var thegrid = jQuery("#list")[0];
                        var myjsongrid = eval('(' + clearJson + ')');
                        thegrid.addJSONData(myjsongrid.replace(/\\/g, ''));
                    }
                }
            });
        },
        colNames: ['DenarnaEnotaID', 'Kratica', 'Sifra', 'Naziv'],
        colModel: [
            {name:'DenarnaEnotaID', index:'DenarnaEnotaID', width:100},
            {name:'Kratica', index:'Kratica', width:100},
            {name:'Sifra', index:'Sifra', width:100},
            {name:'Naziv', index:'Naziv', width:100}
        ],
        rowNum: 15,
        rowList: [15,30,100],
        pager: jQuery('#pager'),
        sortname: 'id',
        // loadtext:"Nalagam zapise...",
        // viewrecords: true,
        sortorder: "desc",
        // caption:"Vrstice",
        // width:"800",
        imgpath: "../Scripts/JGrid/themes/basic/images"
    });
});
From the WS I get JSON like this:
{"page":"1","total":"10","records":"160","rows":[{"id":"18","cell":"["18","BAM","Konvertibilna marka","977"]"},{"id":"19","cell":"["19","RSD","Srbski dinar","941"]"},{"id":"20","cell":"["20","AFN","Afgani","971"]"},{"id":"21","cell":"["21","ALL","Lek","008"]"},{"id":"22","cell":"["22","DZD","Alžirski dinar","012"]"},{"id":"23","cell":"["23","AOA","Kvanza","973"]"},{"id":"24","cell":"["24","XCD","Vzhodnokaribski dolar","951"]"},{"id":"25","cell":" ……………… ["13","PLN","Poljski zlot","985"]"},{"id":"14","cell":"["14","SEK","Švedska krona","752"]"},{"id":"15","cell":"["15","SKK","Slovaška krona","703"]"},{"id":"16","cell":"["16","USD","Ameriški dolar","840"]"},{"id":"17","cell":"["17","XXX","Nobena valuta","000"]"},{"id":"1","cell":"["1","SIT","Slovenski tolar","705"]"}]}
I have registered these scripts:
clientSideScripts.RegisterClientScriptFile("prototype.js", CommonFunctions.FixupUrlWithoutSessionID("~/WebUI/Scripts/prototype-1.6.0.2.js"));
clientSideScripts.RegisterClientScriptFile("jquery.js", CommonFunctions.FixupUrlWithoutSessionID("~/WebUI/Scripts/JGrid/jquery.js"));
clientSideScripts.RegisterClientScriptFile("jquery.jqGrid.js", CommonFunctions.FixupUrlWithoutSessionID("~/WebUI/Scripts/JGrid/jquery.jqGrid.js"));
clientSideScripts.RegisterClientScriptFile("jqModal.js", CommonFunctions.FixupUrlWithoutSessionID("~/WebUI/Scripts/JGrid/js/jqModal.js"));
clientSideScripts.RegisterClientScriptFile("jqDnR.js", CommonFunctions.FixupUrlWithoutSessionID("~/WebUI/Scripts/JGrid/js/jqDnR.js"));
Basically I think it must be something stupid... but I can't figure it out now... Help wanted.

    Read the article

  • OpenVPN Error : TLS Error: local/remote TLS keys are out of sync: [AF_INET]

    - by Lucidity
First off, thanks for reading this; I appreciate any and all suggestions. I am having some serious problems reconnecting to my OpenVPN client using Riseup.net's VPN. I have spent a few days banging my head against the wall in attempts to set this up on my iOS devices... but that is a whole other issue. I was, however, able to set it up on my Mac OS X, specifically on my Windows Vista 32-bit Boot Camp VM, with relatively little trouble. To originally connect I only had to modify the recommended config file very slightly (config file included at the end of this post):
- I had to enter the code directly into my config file
- And change "dev tap" to "dev tun"
So I was connected. (Note - I did test to ensure the VPN was actually working after I originally connected; it was. I also verified the .pem file (inserted as the coding in my config file) for authenticity.) I left the VPN running. My computer went to sleep. Today I went to use the internet expecting (possibly incorrectly - I am now unsure if I was wrong to leave it running) to still be connected to the VPN. However I saw immediately I was not. I went to reconnect. And was (am) unable to. My logs after attempting to connect (and getting a connection failed dialog box) show everything working as it should (as far as I can tell) until the end, where I get the following lines:
Mon Sep 23 21:07:49 2013 us=276809 Initialization Sequence Completed
Mon Sep 23 21:07:49 2013 us=276809 MANAGEMENT: >STATE:1379995669,CONNECTED,SUCCESS, OMITTED
Mon Sep 23 21:22:50 2013 us=390350 Authenticate/Decrypt packet error: packet HMAC authentication failed
Mon Sep 23 21:23:39 2013 us=862180 TLS Error: local/remote TLS keys are out of sync: [AF_INET] VPN IP OMITTED [2]
Mon Sep 23 21:23:57 2013 us=395183 Authenticate/Decrypt packet error: packet HMAC authentication failed
Mon Sep 23 22:07:41 2013 us=296898 TLS: soft reset sec=0 bytes=513834601/0 pkts=708032/0
Mon Sep 23 22:07:41 2013 us=671299 VERIFY OK: depth=1, C=US, O=Riseup Networks, L=Seattle, ST=WA, CN=Riseup Networks, [email protected]
Mon Sep 23 22:07:41 2013 us=671299 VERIFY OK: depth=0, C=US, O=Riseup Networks, L=Seattle, ST=WA, CN=vpn.riseup.net
Mon Sep 23 22:07:46 2013 us=772508 Data Channel Encrypt: Cipher 'BF-CBC' initialized with 128 bit key
Mon Sep 23 22:07:46 2013 us=772508 Data Channel Encrypt: Using 160 bit message hash 'SHA1' for HMAC authentication
Mon Sep 23 22:07:46 2013 us=772508 Data Channel Decrypt: Cipher 'BF-CBC' initialized with 128 bit key
Mon Sep 23 22:07:46 2013 us=772508 Data Channel Decrypt: Using 160 bit message hash 'SHA1' for HMAC authentication
Mon Sep 23 22:07:46 2013 us=772508 Control Channel: TLSv1, cipher TLSv1/SSLv3 DHE-RSA-AES256-SHA, 2048 bit RSA
So I have searched for a solution online and I have included what I have attempted below; however, I fear (know) I am not knowledgeable enough in this area to fix this myself. I apologize in advance for my ignorance. I do tech support for a living, but not this kind of tech support, unfortunately. Other notes and troubleshooting done:
- Windows Firewall is disabled completely, as well as other anti-virus programs
- Tor is disabled completely
- No proxies running
- Time is correct in all locations
- Router firmware is up to date
- Able to connect to the internet, and as far as I can tell all necessary ports are open
- No settings have been altered since I was able to connect successfully
- Ethernet as well as wifi connections attempted; both resulted in the same error
Also tried adding the following lines to my config file (without success or change in error):
persist-key
persist-tun
proto tcp (after reading that this error generally occurs on UDP connections, and is extremely rare on TCP)
resolv-retry infinite (thinking the connection may have timed out, since the issues occurred after leaving the VPN connected during about 10 hrs of the computer in sleep mode)
All attempts resulted in the exact same error code included at the top of this post. The original suggestions I found online stated (regarding the TLS error) - This error should resolve itself within 60 seconds; if not, quit, wait 120 seconds, and try again. (Which isn't the case here...) (Regarding the "out of sync" error) - If you continue to get "out of sync" errors and the link does not come up, then it means that something is probably wrong with your config file. You must use either ping and ping-restart on both sides of the connection, or keepalive on the server side of a client/server connection, in order to gracefully recover from "local/remote TLS keys are out of sync" errors. I wouldn't be surprised if my config file is lacking, or not correct. However, I can confirm I followed the instructions to a tee. And I was able to connect originally (and have not modified my settings or config file between when I was able to connect and when the error began occurring). I have a very simple config file:
client
dev tun
tun-mtu 1500
remote vpn.riseup.net
auth-user-pass
ca RiseupCA.pem
redirect-gateway
verb 4
<ca>
-----BEGIN CERTIFICATE-----
[OMITTED]
-----END CERTIFICATE-----
</ca>
I would really appreciate any help or suggestions. I am at a total loss here; I know I'm asking a lot. Though I am a new user on this site, I help others on many forums, including Microsoft's support community and especially Apple's support communities, so I will definitely pass on anything I learn here to help others. Thanks so so so much in advance for reading this.
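    A minimal sketch of the ping/ping-restart suggestion quoted above, applied to the riseup config shown (the directive values and the config file name are illustrative assumptions, not riseup's recommendation):

        # Append client-side ping/ping-restart directives so a stale tunnel
        # (e.g. after sleep/resume) is detected and renegotiated cleanly;
        # keepalive would be the server-side equivalent of this pair.
        cat >> riseup.conf <<'EOF'
        ping 10
        ping-restart 60
        persist-key
        persist-tun
        EOF

        # Restart the client against the updated config.
        openvpn --config riseup.conf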

    Read the article

  • Git for beginners: The definitive practical guide

    - by Adam Davis
Ok, after seeing this post by PJ Hyett, I have decided to skip to the end and go with git. So what I need is a beginner's practical guide to git. "Beginner" being defined as someone who knows how to handle their compiler, understands to some level what a makefile is, and has touched source control without understanding it very well. "Practical" being defined as this person doesn't want to get into great detail regarding what git is doing in the background, and doesn't even care (or know) that it's distributed. Your answers might hint at the possibilities, but try to aim for the beginner that wants to keep a 'main' repository on a 'server' which is backed up and secure, and treat their local repository as merely a 'client' resource. Procedural note: PLEASE pick one and only one of the below topics and answer it clearly and concisely in any given answer. Don't try to jam a bunch of information into one answer. Don't just link to other resources - cut and paste with attribution if copyright allows, otherwise learn it and explain it in your own words (i.e., don't make people leave this page to learn a task). Please comment on, or edit, an already existing answer unless your explanation is very different and you think the community is better served with a different explanation rather than altering the existing explanation. So:
Installation/Setup
- How to install git
- How do you set up git? Try to cover linux, windows, mac; think 'client/server' mindset.
- Setup GIT Server with Msysgit on Windows
- How do you create a new project/repository?
- How do you configure it to ignore files (.obj, .user, etc) that are not really part of the codebase?
Working with the code
- How do you get the latest code?
- How do you check out code?
- How do you commit changes?
- How do you see what's uncommitted, or the status of your current codebase?
- How do you destroy unwanted commits?
- How do you compare two revisions of a file, or your current file and a previous revision?
- How do you see the history of revisions to a file?
- How do you handle binary files (visio docs, for instance, or compiler environments)?
- How do you merge files changed at the "same time"?
- How do you undo (revert or reset) a commit?
Tagging, branching, releases, baselines
- How do you 'mark', 'tag' or 'release' a particular set of revisions for a particular set of files so you can always pull that one later?
- How do you pull a particular 'release'?
- How do you branch?
- How do you merge branches?
- How do you resolve conflicts and complete the merge?
- How do you merge parts of one branch into another branch?
- What is rebasing?
- How do I track remote branches?
- How can I create a branch on a remote repository?
Other
- Describe and link to a good gui, IDE plugin, etc that makes git a non-command line resource, but please list its limitations as well as its good points:
  msysgit - Cross platform, included with git
  gitk - Cross platform history viewer, included with git
  gitnub - OS X
  gitx - OS X history viewer
  smartgit - Cross platform, commercial, beta
  tig - console GUI for Linux
  qgit - GUI for Windows, Linux
- Any other common tasks a beginner should know? git status tells you what you just did, what branch you have, and other useful information.
- How do I work effectively with a subversion repository set as my source control source?
Other git beginner's references
- git guide
- git book
- git magic
- gitcasts
- github guides
- git tutorial
- Progit - book by Scott Chacon
- Git - SVN Crash Course
Delving into git
- Understanding git conceptually
I will go through the entries from time to time and 'tidy' them up so they have a consistent look/feel and it's easy to scan the list - feel free to follow a simple "header - brief explanation - list of instructions - gotchas and extra info" template. I'll also link to the entries from the bullet list above so it's easy to find them later.
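    For the beginner workflow the question targets (main repository on a backed-up server, local repository as a 'client'), here is a minimal shell sketch; the repository URL, file name, and branch names are placeholders:

        # One-time setup: identify yourself to git.
        git config --global user.name  "Your Name"
        git config --global user.email "you@example.com"

        # Get a local "client" copy of the main repository on the server.
        git clone ssh://server/srv/git/project.git
        cd project

        # Everyday cycle: inspect, stage, commit locally, then publish.
        git status
        git add file.c
        git commit -m "Describe the change"
        git push origin master     # send commits to the 'main' repository
        git pull origin master     # fetch and merge the latest code

        # Tagging and branching.
        git tag -a v1.0 -m "Release 1.0"
        git checkout -b feature-x  # create and switch to a branch
        git checkout master
        git merge feature-x        # merge it back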

    Read the article

  • Update packages on very old ubuntu

    - by meewoK
I want to add MySQLi support to a machine running: Server Version: Apache/2.2.4 (Ubuntu) PHP/5.2.3-1ubuntu6.3 I would rather not update more things than I need to. I run the following:
sudo apt-get install php5-mysql
However, as the Ubuntu version is old, I get the following:
WARNING: The following packages cannot be authenticated! php5-cli php5-mysql php5-mhash php5-xsl php5-pspell php5-snmp php5-curl php5-xmlrpc php5-sqlite php5-gd libapache2-mod-php5 php5-common
Install these packages without verification [y/N]? Y
Err http://gr.archive.ubuntu.com gutsy-updates/main php5-cli 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-cli 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-mysql 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-mhash 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-xsl 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-pspell 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-snmp 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-curl 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-xmlrpc 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-sqlite 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-gd 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main libapache2-mod-php5 5.2.3-1ubuntu6.4 404 Not Found
Err http://security.ubuntu.com gutsy-security/main php5-common 5.2.3-1ubuntu6.4 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-cli_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-mysql_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-mhash_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-xsl_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-pspell_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-snmp_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-curl_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-xmlrpc_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-sqlite_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-gd_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/libapache2-mod-php5_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/p/php5/php5-common_5.2.3-1ubuntu6.4_i386.deb 404 Not Found
E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
Questions:
- Can I add the mysqli feature using another method instead of sudo apt-get?
- Even if successful, can this break something on the system?
    Update: I have tried to add additional sources using the instructions from http://superuser.com/questions/339537/where-can-i-get-therepositories-for-old-ubuntu-versions. I have the following in the /etc/apt/sources.list file:

        # deb cdrom:[Ubuntu-Server 7.10 _Gutsy Gibbon_ - Release i386 (20071016)]/ gutsy main restricted
        #deb cdrom:[Ubuntu-Server 7.10 _Gutsy Gibbon_ - Release i386 (20071016)]/ gutsy main restricted
        # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to
        # newer versions of the distribution.
        deb http://gr.archive.ubuntu.com/ubuntu/ gutsy main restricted universe multiverse
        deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-backports main restricted universe multiverse
        deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy main restricted
        ## Major bug fix updates produced after the final release of the
        ## distribution.
        deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates main restricted
        deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates main restricted
        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## universe WILL NOT receive any review or updates from the Ubuntu security
        ## team.
        deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates universe
        deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates universe
        ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu
        ## team, and may not be under a free licence. Please satisfy yourself as to
        ## your rights to use the software. Also, please note that software in
        ## multiverse WILL NOT receive any review or updates from the Ubuntu
        ## security team.
        #deb http://gr.archive.ubuntu.com/ubuntu/ gutsy multiverse
        deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy multiverse
        deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates multiverse
        deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy-updates multiverse
        ## Uncomment the following two lines to add software from the 'backports'
        ## repository.
        ## N.B. software from this repository may not have been tested as
        ## extensively as that contained in the main release, although it includes
        ## newer versions of some applications which may provide useful features.
        ## Also, please note that software in backports WILL NOT receive any review
        ## or updates from the Ubuntu security team.
        # deb http://gr.archive.ubuntu.com/ubuntu/ gutsy-backports main restricted universe multiverse
        # deb-src http://gr.archive.ubuntu.com/ubuntu/ gutsy-backports main restricted universe multiverse
        ## Uncomment the following two lines to add software from Canonical's
        ## 'partner' repository. This software is not part of Ubuntu, but is
        ## offered by Canonical and the respective vendors as a service to Ubuntu
        ## users.
        # deb http://archive.canonical.com/ubuntu gutsy partner
        # deb-src http://archive.canonical.com/ubuntu gutsy partner
        deb http://security.ubuntu.com/ubuntu gutsy-security main restricted
        deb-src http://security.ubuntu.com/ubuntu gutsy-security main restricted
        deb http://security.ubuntu.com/ubuntu gutsy-security universe
        deb-src http://security.ubuntu.com/ubuntu gutsy-security universe
        deb http://security.ubuntu.com/ubuntu gutsy-security multiverse
        deb-src http://security.ubuntu.com/ubuntu gutsy-security multiverse
        # Required
        deb http://old-releases.ubuntu.com/ubuntu/gutsy main restricted universe multiverse
        deb http://old-releases.ubuntu.com/ubuntu/gutsy-updates main restricted universe multiverse
        deb http://old-releases.ubuntu.com/ubuntu/gutsy-security main restricted universe multiverse
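    Two notes for anyone else hitting this wall. First, Gutsy left the primary and security mirrors long ago, so the gr.archive.ubuntu.com and security.ubuntu.com lines above can only return 404s; old-releases.ubuntu.com is the only mirror that still carries it. Second, the three old-releases lines at the bottom appear to be missing a space between /ubuntu/ and the suite name, so apt would mis-parse them even though the host is correct. A minimal sketch of the usual fix - assuming the machine really must stay on 7.10, since upgrading the release is the safer answer to "can this break something?" - is to point everything at old-releases and refresh the index:

        # rewrite the dead mirrors to old-releases (sed keeps a .bak copy of the file)
        sudo sed -i.bak \
            -e 's|http://gr.archive.ubuntu.com/ubuntu/|http://old-releases.ubuntu.com/ubuntu/|g' \
            -e 's|http://security.ubuntu.com/ubuntu|http://old-releases.ubuntu.com/ubuntu|g' \
            /etc/apt/sources.list
        # each old-releases line should read: deb http://old-releases.ubuntu.com/ubuntu/ gutsy ...
        sudo apt-get update
        sudo apt-get install php5-mysql

    The packages will still be the unpatched 2007-era builds (and still unauthenticated if the archive keyring is out of date), so treat this strictly as a stopgap.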

    Read the article

  • How to set up spf records to send mail from google hosted apps to gmail addresses

    - by Chris Adams
    Hi there, I'm trying to work out why email I send from one domain I own is rejected by another that I own, and while I think it may be related to how I've set up SPF records, I'm not sure what steps I need to take to fix it. Here's the error message I receive:

        Technical details of permanent failure: Google tried to deliver your message, but it was rejected by the recipient domain. We recommend contacting the other email provider for further information about the cause of this error. The error that the other server returned was: 550 550-Verification failed for <[email protected]> 550-No Such User Here 550 Sender verify failed (state 14).

    Here's the response from [email protected]:

        Delivered-To: [email protected]
        Received: by 10.86.92.9 with SMTP id p9cs85371fgb; Wed, 2 Sep 2009 22:33:32 -0700 (PDT)
        Received: by 10.90.205.4 with SMTP id c4mr2406190agg.29.1251956007562; Wed, 02 Sep 2009 22:33:27 -0700 (PDT)
        Return-Path: <[email protected]>
        Received: from verifier.port25.com (207-36-201-235.ptr.primarydns.com [207.36.201.235]) by mx.google.com with ESMTP id 26si831174aga.24.2009.09.02.22.33.25; Wed, 02 Sep 2009 22:33:26 -0700 (PDT)
        Received-SPF: pass (google.com: domain of [email protected] designates 207.36.201.235 as permitted sender) client-ip=207.36.201.235;
        Authentication-Results: mx.google.com; spf=pass (google.com: domain of [email protected] designates 207.36.201.235 as permitted sender) [email protected]; dkim=pass [email protected]
        DKIM-Signature: v=1; a=rsa-sha1; c=relaxed/relaxed; s=auth; d=port25.com; h=Date:From:To:Subject:Message-Id:In-Reply-To; [email protected]; bh=GRMrcnoucTl4upzqJYTG5sOZMLU=; b=uk6TjADEyZVRkceQGjH94ZzfVeRTsiZPzbXuhlqDt1m+kh1zmdUEoiTOzd89ryCHMbVcnG1JajBj 5vOMKYtA3g==
        DomainKey-Signature: a=rsa-sha1; c=nofws; q=dns; s=auth; d=port25.com; b=NqKCPK00Xt49lbeO009xy4ZRgMGpghvcgfhjNy7+qI89XKTzi6IUW0hYqCQyHkd2p5a1Zjez2ZMC l0u9CpZD3Q==;
        Received: from verifier.port25.com (127.0.0.1) by verifier.port25.com (PowerMTA(TM) v3.6a1) id hjt9pq0hse8u for <[email protected]>; Thu, 3 Sep 2009 01:26:52 -0400 (envelope-from <[email protected]>)
        Date: Thu, 3 Sep 2009 01:26:52 -0400
        From: [email protected]
        To: [email protected]
        Subject: Authentication Report
        Message-Id: <[email protected]>
        Precedence: junk (auto_reply)
        In-Reply-To: <[email protected]>

        This message is an automatic response from Port25's authentication verifier service at verifier.port25.com. The service allows email senders to perform a simple check of various sender authentication mechanisms. It is provided free of charge, in the hope that it is useful to the email community. While it is not officially supported, we welcome any feedback you may have at <[email protected]>. Thank you for using the verifier, The Port25 Solutions, Inc. team

        ==========================================================
        Summary of Results
        ==========================================================
        SPF check: pass
        DomainKeys check: neutral
        DKIM check: neutral
        Sender-ID check: pass
        SpamAssassin check: ham

        ==========================================================
        Details:
        ==========================================================
        HELO hostname: fg-out-1718.google.com
        Source IP: 72.14.220.158
        mail-from: [email protected]

        ----------------------------------------------------------
        SPF check details:
        ----------------------------------------------------------
        Result: pass
        ID(s) verified: [email protected]
        DNS record(s):
        stemcel.co.uk. 14400 IN TXT "v=spf1 include:aspmx.googlemail.com ~all"
        aspmx.googlemail.com. 7200 IN TXT "v=spf1 redirect=_spf.google.com"
        _spf.google.com. 300 IN TXT "v=spf1 ip4:216.239.32.0/19 ip4:64.233.160.0/19 ip4:66.249.80.0/20 ip4:72.14.192.0/18 ip4:209.85.128.0/17 ip4:66.102.0.0/20 ip4:74.125.0.0/16 ip4:64.18.0.0/20 ip4:207.126.144.0/20 ?all"

        ----------------------------------------------------------
        DomainKeys check details:
        ----------------------------------------------------------
        Result: neutral (message not signed)
        ID(s) verified: [email protected]
        DNS record(s):

        ----------------------------------------------------------
        DKIM check details:
        ----------------------------------------------------------
        Result: neutral (message not signed)
        ID(s) verified:
        NOTE: DKIM checking has been performed based on the latest DKIM specs (RFC 4871 or draft-ietf-dkim-base-10) and verification may fail for older versions. If you are using Port25's PowerMTA, you need to use version 3.2r11 or later to get a compatible version of DKIM.

        ----------------------------------------------------------
        Sender-ID check details:
        ----------------------------------------------------------
        Result: pass
        ID(s) verified: [email protected]
        DNS record(s):
        stemcel.co.uk. 14400 IN TXT "v=spf1 include:aspmx.googlemail.com ~all"
        aspmx.googlemail.com. 7200 IN TXT "v=spf1 redirect=_spf.google.com"
        _spf.google.com. 300 IN TXT "v=spf1 ip4:216.239.32.0/19 ip4:64.233.160.0/19 ip4:66.249.80.0/20 ip4:72.14.192.0/18 ip4:209.85.128.0/17 ip4:66.102.0.0/20 ip4:74.125.0.0/16 ip4:64.18.0.0/20 ip4:207.126.144.0/20 ?all"

        ----------------------------------------------------------
        SpamAssassin check details:
        ----------------------------------------------------------
        SpamAssassin v3.2.5 (2008-06-10)
        Result: ham (-2.6 points, 5.0 required)
        pts rule name description
        ---- ---------------------- --------------------------------------------------
        -0.0 SPF_PASS SPF: sender matches SPF record
        -2.6 BAYES_00 BODY: Bayesian spam probability is 0 to 1% [score: 0.0000]
        0.0 HTML_MESSAGE BODY: HTML included in message

    I've registered the SPF records for my domain, as advised here. Both domains pass validation according to Kitterman's SPF record testing tools, so I'm somewhat confused about this. I also have the catchall address set up on the stemcel.co.uk domain here, but I don't have one set up for chrisadams.me.uk. Instead, we have the following forwarders set up:

        [email protected] to [email protected]
        [email protected] to [email protected]
        [email protected] to [email protected]
        [email protected] to [email protected]

    Any ideas how to get this working? I'm not sure what I should be looking for here.
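    Worth noting when reading the report above: the bounce is a sender-verify (callout) failure, not an SPF failure - the verifier shows SPF passing - so the server receiving mail for chrisadams.me.uk is asking the sending domain's mail exchanger whether the envelope sender exists and being told "No Such User Here". A quick sketch for checking what both sides actually publish, assuming a shell with dig available:

        # the SPF policy each domain publishes (TXT), and where a sender-verify callout would land (MX)
        dig +short TXT stemcel.co.uk
        dig +short TXT chrisadams.me.uk
        dig +short MX stemcel.co.uk
        dig +short MX chrisadams.me.uk

    If the MX host for the sending domain rejects RCPT TO for the exact envelope sender (catchall notwithstanding), the 550 will persist no matter how clean the SPF records are.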

    Read the article

  • Migrate from MySQL to PostgreSQL on Linux (Kubuntu)

    - by Dave Jarvis
    A long time ago in a galaxy far, far away... Trying to migrate a database from MySQL to PostgreSQL. All the documentation I have read covers, in great detail, how to migrate the structure. I have found very little documentation on migrating the data. The schema has 13 tables (which have been migrated successfully) and 9 GB of data. MySQL version: 5.1.x. PostgreSQL version: 8.4.x. I want to use the R programming language to analyze the data using SQL select statements; PostgreSQL has PL/R, but MySQL has nothing (as far as I can tell).

    A New Hope. Create the database location (/var has insufficient space; I also dislike having the PostgreSQL version number everywhere -- upgrading would break scripts!):

        sudo mkdir -p /home/postgres/main
        sudo cp -Rp /var/lib/postgresql/8.4/main /home/postgres
        sudo chown -R postgres.postgres /home/postgres
        sudo chmod -R 700 /home/postgres
        sudo usermod -d /home/postgres/ postgres

    All good to here. Next, restart the server and configure the database using these installation instructions:

        sudo apt-get install postgresql pgadmin3
        sudo /etc/init.d/postgresql-8.4 stop
        sudo vi /etc/postgresql/8.4/main/postgresql.conf   (change data_directory to /home/postgres/main)
        sudo /etc/init.d/postgresql-8.4 start
        sudo -u postgres psql postgres
        \password postgres
        sudo -u postgres createdb climate
        pgadmin3

    Use pgadmin3 to configure the database and create a schema. The episode continues in a remote shell known as bash, with both databases running, and the installation of a set of tools with a rather unusual logo: SQL Fairy.

        perl Makefile.PL
        sudo make install
        sudo apt-get install perl-doc   (strangely, it is not called perldoc)
        perldoc SQL::Translator::Manual

    Extract a PostgreSQL-friendly DDL and all the MySQL data:

        sqlt -f DBI --dsn dbi:mysql:climate --db-user user --db-password password -t PostgreSQL > climate-pg-ddl.sql
        mysqldump --skip-add-locks --complete-insert --no-create-db --no-create-info --quick --result-file="climate-my.sql" --databases climate --skip-comments -u root -p

    The Database Strikes Back. Recreate the structure in PostgreSQL as follows: switch to pgadmin3, click the Execute arbitrary SQL queries icon, open climate-pg-ddl.sql, search for TABLE " and replace with TABLE climate." (inserting the schema name climate), search for on " and replace with on climate." (inserting the schema name climate), then press F5 to execute. This results in: Query returned successfully with no result in 122 ms.

    Replies of the Jedi. At this point I am stumped. Where do I go from here (what are the steps) to convert climate-my.sql to climate-pg.sql so that it can be executed against PostgreSQL? How do I make sure the indexes are copied over correctly (to maintain referential integrity; I don't have constraints at the moment, to ease the transition)? How do I ensure that adding new rows in PostgreSQL will start enumerating from the index of the last row inserted (and not conflict with an existing primary key from the sequence)? How do you ensure the schema name comes through when transforming the data from MySQL to PostgreSQL inserts?

    Resources. A fair bit of information was needed to get this far:

        https://help.ubuntu.com/community/PostgreSQL
        http://articles.sitepoint.com/article/site-mysql-postgresql-1
        http://wiki.postgresql.org/wiki/Converting_from_other_Databases_to_PostgreSQL#MySQL
        http://pgfoundry.org/frs/shownotes.php?release_id=810
        http://sqlfairy.sourceforge.net/

    Thank you!
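    For readers at the same point, here is a minimal sketch of one way to finish the job (the table name station and sequence name station_id_seq below are hypothetical placeholders - substitute your own 13 tables): re-dump the data in roughly PostgreSQL-flavoured syntax, load it with the climate schema on the search path, then resynchronise each sequence past the highest imported key.

        # data-only dump in (approximately) PostgreSQL-compatible syntax; quoting may still need hand-fixes
        mysqldump --compatible=postgresql --no-create-info --complete-insert \
            --skip-add-locks --skip-comments -u root -p climate > climate-pg-data.sql
        # load with the target schema first on the search path
        PGOPTIONS='-c search_path=climate' psql -U postgres -d climate -f climate-pg-data.sql
        # repeat per table: make the serial's sequence continue after the imported ids
        psql -U postgres -d climate \
            -c "SELECT setval('climate.station_id_seq', (SELECT max(id) FROM climate.station));"

    The setval() call is also the answer to the "start enumerating from the last row inserted" question: a PostgreSQL sequence knows nothing about imported rows until it is told.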

    Read the article

  • Add Service Reference is generating Message Contracts

    - by JohnIdol
    OK, this has been haunting me for a while; I can't find much on Google and I am starting to lose hope, so I am resorting to the SO community. When I import a given service using "Add Service Reference" in Visual Studio 2008 (SP1), all the Request/Response messages are being unnecessarily wrapped into Message Contracts (named as "operationName" + "Request"/"Response" + "1" at the end). The code generator says:

        // CODEGEN: Generating message contract since the operation XXX is neither RPC nor document wrapped.

    The guys who are generating the wsdl from a Java service say they are specifying DOCUMENT-LITERAL/WRAPPED. Any help/pointer/clue would be highly appreciated.

    Update: this is a sample of my wsdl for one of the operations that look suspicious. Note the mismatch on the message element attribute for the request, compared to the response.

        <!-- imports namespaces and defines elements -->
        <wsdl:types>
          <xsd:schema targetNamespace="http://WHATEVER/" xmlns:xsd_1="http://WHATEVER_1/" xmlns:xsd_2="http://WHATEVER_2/">
            <xsd:import namespace="http://WHATEVER_1/" schemaLocation="WHATEVER_1.xsd"/>
            <xsd:import namespace="http://WHATEVER_2/" schemaLocation="WHATEVER_2.xsd"/>
            <xsd:element name="myOperationResponse" type="xsd_1:MyOperationResponse"/>
            <xsd:element name="myOperation" type="xsd_1:MyOperationRequest"/>
          </xsd:schema>
        </wsdl:types>

        <!-- declares messages - NOTE the mismatch on the request element attribute compared to response -->
        <wsdl:message name="myOperationRequest">
          <wsdl:part element="tns:myOperation" name="request"/>
        </wsdl:message>
        <wsdl:message name="myOperationResponse">
          <wsdl:part element="tns:myOperationResponse" name="response"/>
        </wsdl:message>

        <!-- operations -->
        <wsdl:portType name="MyService">
          <wsdl:operation name="myOperation">
            <wsdl:input message="tns:myOperationRequest"/>
            <wsdl:output message="tns:myOperationResponse"/>
            <wsdl:fault message="tns:myOperationFault" name="myOperationFault"/>
            <wsdl:fault message="tns:myOperationFault1" name="myOperationFault1"/>
          </wsdl:operation>
        </wsdl:portType>

    Update 2: I pulled all the types that I had in my imported namespace (they were in a separate xsd) into the wsdl, as I suspected the import could be triggering the message contract generation. To my surprise it was not the case, and having all the types defined in the wsdl did not change anything. I then (out of desperation) started constructing wsdls from scratch, and by playing with the maxOccurs attributes of element attributes contained in a sequence attribute I was able to reproduce the undesired message contract generation behavior. Here's a sample of an element:

        <xsd:element name="myElement">
          <xsd:complexType>
            <xsd:sequence>
              <xsd:element minOccurs="0" maxOccurs="1" name="arg1" type="xsd:string"/>
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>

    Playing with maxOccurs on elements that are used as messages (all requests and responses, basically), the following happens:

        maxOccurs = "1" does not trigger the wrapping
        maxOccurs > 1 triggers the wrapping
        maxOccurs = "unbounded" triggers the wrapping

    I was not able to reproduce this on my production wsdl yet because the nesting of the types goes very deep, and it's gonna take me time to inspect it thoroughly. In the meanwhile I am hoping it might ring a bell - any help highly appreciated.

    Read the article

  • Android: get current location from best available provider

    - by AP257
    Hi all, I have some Android code that needs to get the best available location QUICKLY, from GPS, network or whatever is available. Accuracy is less important than speed. Getting the best available location is surely a really standard task, yet I can't find any code to demonstrate it. The Android location code expects you to specify criteria, register for updates, and wait - which is fine if you have detailed criteria and don't mind waiting around. But my app needs to work a bit more like the Maps app does when it first locates you - work from any available provider, and just check the location isn't wildly out of date or null. I've attempted to roll my own code to do this, but am having problems. (It's inside an IntentService where an upload happens, if that makes any difference. I've included all the code for info.) What's wrong with this code?

        @Override
        protected void onHandleIntent(Intent arg0) {
            testProviders();
            doUpload();
        }

        private boolean doUpload() {
            int j = 0;
            // check if we have accurate location data yet - wait up to 30 seconds
            while (j < 30) {
                if ((latString == "") || (lonString == "")) {
                    Log.d(LOG_TAG, "latlng null");
                    Thread.sleep(1000);
                    j++;
                } else {
                    Log.d(LOG_TAG, "found lat " + latString + " and lon " + lonString);
                    break;
                }
                // do the upload here anyway, with or without location data
                // [code removed for brevity]
        }

        public boolean testProviders() {
            Log.e(LOG_TAG, "testProviders");
            String location_context = Context.LOCATION_SERVICE;
            locationmanager = (LocationManager) getSystemService(location_context);
            List<String> providers = locationmanager.getProviders(true);
            for (String provider : providers) {
                Log.e(LOG_TAG, "registering provider " + provider);
                listener = new LocationListener() {
                    public void onLocationChanged(Location location) {
                        // keep checking the location - until we have what we need
                        //if (!checkLoc(location)) {
                        Log.e(LOG_TAG, "onLocationChanged");
                        locationDetermined = checkLoc(location);
                        //}
                    }
                    public void onProviderDisabled(String provider) {
                    }
                    public void onProviderEnabled(String provider) {
                    }
                    public void onStatusChanged(String provider, int status, Bundle extras) {
                    }
                };
                locationmanager.requestLocationUpdates(provider, 0, 0, listener);
            }
            Log.e(LOG_TAG, "getting updates");
            return true;
        }

        private boolean checkLoc(Location location) {
            float tempAccuracy = location.getAccuracy();
            int locAccuracy = (int) tempAccuracy;
            Log.d(LOG_TAG, "locAccuracy = " + locAccuracy);
            if ((locAccuracy != 0) && (locAccuracy < LOCATION_ACCURACY)) {
                latitude = location.getLatitude();
                longitude = location.getLongitude();
                latString = latitude.toString();
                lonString = longitude.toString();
                return true;
            }
            return false;
        }

        public void removeListeners() {
            // Log.e(LOG_TAG, "removeListeners");
            if ((locationmanager != null) && (listener != null)) {
                locationmanager.removeUpdates(listener);
            }
            locationmanager = null;
            // Log.d(LOG_TAG, "Removed " + listener.toString());
        }

        @Override
        public void onDestroy() {
            super.onDestroy();
            removeListeners();
        }

    Unfortunately, this finds the network provider, but only ever outputs latlng null 30 times - it never seems to get a location at all. I never even get a log statement of locationChanged. It's funny, because from ddms I can see output like:

        NetworkLocationProvider: onCellLocationChanged [305,8580]
        NetworkLocationProvider: getNetworkLocation(): returning cache location with accuracy 75.0

    seeming to suggest that the network provider does have some location info after all; I'm just not getting at it. Can anyone help? I think working example code would be a useful resource for the Android/StackOverflow community.

    Read the article

  • What git branching models actually work - the final question

    - by UncleCJ
    In our company we have successfully deployed git and we are currently using a simple trunk/release/hotfixes branching model. However, this has its problems, and I have some key issues of confusion in the community which it would be awesome to have answered here. Maybe my hopes for an Alexander stroke are too great; quite possibly I'll decompose this question into more manageable issues, but here's my first shot.

    Workflows / branching models - below are the three main descriptions of this I have seen, but they partially contradict each other or don't go far enough to sort out the subsequent issues we've run into (as described below). Thus our team so far defaults to not-so-great solutions. Are you doing something better?
    - gitworkflows(7) Manual Page
    - (nvie) A successful Git branching model
    - (reinh) A Git Workflow for Agile Teams

    Merging vs rebasing (tangled vs sequential history) - the bids on this are as confusing as it gets. Should one pull --rebase or wait with merging back to the mainline until your task is finished? Personally I lean towards merging, since this preserves a visual illustration of on which base a task was started and finished, and I even prefer merge --no-ff for this purpose. It has other drawbacks however. Also many haven't realized the useful property of merging - that it isn't commutative (merging a topic branch into master does not mean merging master into the topic branch).

    I am looking for a natural workflow - sometimes mistakes happen because our procedures don't capture a specific situation with simple rules. For example, a fix needed for earlier releases should of course be based sufficiently downstream to be possible to merge upstream into all branches necessary (is the usage of these terms clear enough?). However it happens that a fix makes it into master before the developer realizes it should have been placed further downstream, and if that is already pushed (even worse, merged or something based on it) then the option remaining is cherry-picking, with its associated perils... What simple rules like these do you use? (A minimal command sketch of the merge-upward rule follows the link list below.) Also included in this is the awkwardness of one topic branch necessarily excluding other topic branches (assuming they are branched from a common baseline). Developers don't want to finish a feature to start another one, feeling like the code they just wrote is not there anymore.

    How to avoid creating merge conflicts (due to cherry-pick)? What seems like a sure way to create a merge conflict is to cherry-pick between branches; they can never be merged again? Would applying the same commit in revert (how to do this?) in either branch possibly solve this situation? This is one reason I do not dare to push for a largely merge-based workflow.

    How to decompose into topical branches? We realize that it would be awesome to assemble a finished integration from topic branches, but often work by our developers is not clearly defined (sometimes as simple as "poking around"), and if some code has already gone into a "misc" topic, it cannot be taken out of there again, according to the question above?

    How do you work with defining/approving/graduating/releasing your topic branches? Proper procedures like code review and graduating would of course be lovely, but we simply cannot keep things untangled enough to manage this - any suggestions? Integration branches - illustration, please?

    Vote and comment as much as you'd like; I'll try to keep the issue page clear and informative enough. Thanks!

    Below is a list of related topics on stackoverflow I have checked out:
    - What are some good strategies to allow deployed applications to be hotfixable?
    - Workflow description for git usage for in-house development
    - Git workflow for corporate Linux kernel development
    - How do you maintain development code and production code? (thanks for this PDF!)
    - git releases management
    - Git Cherry-pick vs Merge Workflow
    - How to cherry-pick multiple commits
    - How do you merge selective files with git-merge?
    - How to cherry pick a range of commits and merge into another branch
    - ReinH Git Workflow
    - git workflow for making modifications you’ll never push back to origin
    - Cherry-pick a merge
    - Proper Git workflow for combined OS and Private code?
    - Maintaining Project with Git
    - Why cant Git merge file changes with a modified parent/master.
    - Git branching / rebasing good practices
    - When will "git pull --rebase" get me in to trouble?
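    Since several of the sub-questions circle around where a fix should be born, here is the minimal command sketch promised above. Branch names are hypothetical, and this is just the gitworkflows(7)-style "commit on the oldest branch that needs it, merge upward" rule rather than the only valid model:

        # start the fix from the oldest maintained line that needs it
        git checkout -b fix/login-timeout release-1.0    # hypothetical names
        # ...commit the fix...
        git checkout release-1.0
        git merge --no-ff fix/login-timeout              # the fix lands downstream first
        git checkout master
        git merge release-1.0                            # propagate upstream; no cherry-pick needed

    Because the same commits flow upward by merge, the cherry-pick conflicts described above never arise: each branch's history contains the fix exactly once.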

    Read the article

  • Windows Server 2008 Task Scheduler Tasks Not Executing

    - by omatase
    I've been having an intermittent problem for some time now with the Windows Task Scheduler that I can't work out. I use the task scheduler to run a C# app I've written that has various plugins used to ensure production systems are working. This task scheduler itself is actually a production system, so I have one simple task that executes every 8 minutes to notify an external monitoring system that the task scheduler is still up. If this external service fails to receive an "all-clear" at least once every 15 minutes (or so; I don't remember the exact number right now), it will message us that the monitoring system is down. In the past we've had intermittent "down" messages from time to time, and each time I've investigated the cause I was unable to find any problems. So this time I wanted to ask the StackOverflow community, since it doesn't look like I'll have luck on my own.

    This morning at 2:32 AM the task fired (exactly 8 minutes after the previous firing); however, the task didn't fire again until 3:28. There are no errors that I can see in the Event Viewer at this time. When I look at the Task Scheduler log there are no errors there either. Here is what the log looks like, though:

        Information 6/11/2011 3:28:56 AM 102 Task completed (2) d6cf2412-269e-48bf-9f40-4a863347baad
        Information 6/11/2011 3:28:56 AM 201 Action completed (2) d6cf2412-269e-48bf-9f40-4a863347baad
        Information 6/11/2011 3:28:55 AM 129 Created Task Process Info
        Information 6/11/2011 3:28:55 AM 200 Action started (1) d6cf2412-269e-48bf-9f40-4a863347baad
        Information 6/11/2011 3:28:55 AM 100 Task Started (1) d6cf2412-269e-48bf-9f40-4a863347baad
        Information 6/11/2011 3:28:55 AM 319 Task Engine received message to start task (1)
        Information 6/11/2011 3:28:55 AM 107 Task triggered on scheduler Info d6cf2412-269e-48bf-9f40-4a863347baad
        Information 6/11/2011 3:28:15 AM 102 Task completed (2) b91fe5ce-39ef-42fb-adbe-bd8be012c00a
        Information 6/11/2011 3:28:15 AM 201 Action completed (2) b91fe5ce-39ef-42fb-adbe-bd8be012c00a
        Information 6/11/2011 3:28:15 AM 102 Task completed (2) 556c07dc-2724-4a21-a97e-dc4abd56f94d
        Information 6/11/2011 3:28:15 AM 201 Action completed (2) 556c07dc-2724-4a21-a97e-dc4abd56f94d
        Information 6/11/2011 3:28:15 AM 102 Task completed (2) 79328289-f742-49dd-aa0d-c3d05db50895
        Information 6/11/2011 3:28:15 AM 201 Action completed (2) 79328289-f742-49dd-aa0d-c3d05db50895
        Information 6/11/2011 3:28:15 AM 102 Task completed (2) 19743755-47b6-4b98-9bec-052193be9496
        Information 6/11/2011 3:28:15 AM 201 Action completed (2) 19743755-47b6-4b98-9bec-052193be9496
        Information 6/11/2011 3:28:15 AM 102 Task completed (2) c165754f-e3e6-4176-a327-11f9c06c39a5
        Information 6/11/2011 3:28:15 AM 201 Action completed (2) c165754f-e3e6-4176-a327-11f9c06c39a5
        Information 6/11/2011 3:28:15 AM 102 Task completed (2) 0e62ad3e-1f6e-40c0-9155-19f0108dee22
        Information 6/11/2011 3:28:15 AM 201 Action completed (2) 0e62ad3e-1f6e-40c0-9155-19f0108dee22
        Information 6/11/2011 3:28:10 AM 129 Created Task Process Info
        Information 6/11/2011 3:28:10 AM 200 Action started (1) 0e62ad3e-1f6e-40c0-9155-19f0108dee22
        Information 6/11/2011 3:28:10 AM 129 Created Task Process Info
        Information 6/11/2011 3:28:10 AM 200 Action started (1) c165754f-e3e6-4176-a327-11f9c06c39a5
        Information 6/11/2011 3:28:10 AM 129 Created Task Process Info
        Information 6/11/2011 3:28:10 AM 200 Action started (1) 19743755-47b6-4b98-9bec-052193be9496
        Information 6/11/2011 3:28:10 AM 129 Created Task Process Info
        Information 6/11/2011 3:28:10 AM 200 Action started (1) 79328289-f742-49dd-aa0d-c3d05db50895
        Information 6/11/2011 3:28:10 AM 129 Created Task Process Info
        Information 6/11/2011 3:28:10 AM 200 Action started (1) 556c07dc-2724-4a21-a97e-dc4abd56f94d
        Information 6/11/2011 3:28:10 AM 129 Created Task Process Info
        Information 6/11/2011 3:28:10 AM 200 Action started (1) b91fe5ce-39ef-42fb-adbe-bd8be012c00a
        Information 6/11/2011 3:28:10 AM 100 Task Started (1) 0e62ad3e-1f6e-40c0-9155-19f0108dee22
        Information 6/11/2011 3:28:10 AM 319 Task Engine received message to start task (1)
        Information 6/11/2011 3:28:10 AM 100 Task Started (1) c165754f-e3e6-4176-a327-11f9c06c39a5
        Information 6/11/2011 3:28:10 AM 319 Task Engine received message to start task (1)
        Information 6/11/2011 3:28:10 AM 100 Task Started (1) 19743755-47b6-4b98-9bec-052193be9496
        Information 6/11/2011 3:28:10 AM 319 Task Engine received message to start task (1)
        Information 6/11/2011 3:28:10 AM 100 Task Started (1) 79328289-f742-49dd-aa0d-c3d05db50895
        Information 6/11/2011 3:28:10 AM 319 Task Engine received message to start task (1)
        Information 6/11/2011 3:28:10 AM 100 Task Started (1) 556c07dc-2724-4a21-a97e-dc4abd56f94d
        Information 6/11/2011 3:28:10 AM 319 Task Engine received message to start task (1)
        Information 6/11/2011 3:28:10 AM 100 Task Started (1) b91fe5ce-39ef-42fb-adbe-bd8be012c00a
        Information 6/11/2011 3:28:10 AM 319 Task Engine received message to start task (1)
        Information 6/11/2011 3:28:10 AM 107 Task triggered on scheduler Info 0e62ad3e-1f6e-40c0-9155-19f0108dee22
        Information 6/11/2011 3:28:10 AM 107 Task triggered on scheduler Info c165754f-e3e6-4176-a327-11f9c06c39a5
        Information 6/11/2011 3:28:10 AM 107 Task triggered on scheduler Info 19743755-47b6-4b98-9bec-052193be9496
        Information 6/11/2011 3:28:10 AM 107 Task triggered on scheduler Info 79328289-f742-49dd-aa0d-c3d05db50895
        Information 6/11/2011 3:28:10 AM 107 Task triggered on scheduler Info 556c07dc-2724-4a21-a97e-dc4abd56f94d
        Information 6/11/2011 3:28:10 AM 107 Task triggered on scheduler Info b91fe5ce-39ef-42fb-adbe-bd8be012c00a
        Information 6/11/2011 2:32:56 AM 102 Task completed (2) 16e4f2c3-a340-410a-9c14-4bfe0861fdd5
        Information 6/11/2011 2:32:56 AM 201 Action completed (2) 16e4f2c3-a340-410a-9c14-4bfe0861fdd5
        Information 6/11/2011 2:32:55 AM 129 Created Task Process Info
        Information 6/11/2011 2:32:55 AM 200 Action started (1) 16e4f2c3-a340-410a-9c14-4bfe0861fdd5
        Information 6/11/2011 2:32:55 AM 100 Task Started (1) 16e4f2c3-a340-410a-9c14-4bfe0861fdd5
        Information 6/11/2011 2:32:55 AM 319 Task Engine received message to start task (1)
        Information 6/11/2011 2:32:55 AM 107 Task triggered on scheduler Info 16e4f2c3-a340-410a-9c14-4bfe0861fdd5

    Seems kind of strange. I also have two other C# apps that run and check something each hour on the hour using the task scheduler. If I look at the history for those, I can see that they didn't execute at 3 AM either! They all waited until 3:28 to start as well. If I look at "tasks completed" in the Event Viewer, it shows that only one task was able to run between 2:32 AM and 3:28 AM.
    The task was "\Microsoft\Windows\RAC\RACAgent", and here's what it looked like:

        Information 6/11/2011 3:18:09 AM 102 Task completed (2) 00c53a85-ba20-4666-80db-fbbe2492c0ad
        Information 6/11/2011 3:18:09 AM 201 Action completed (2) 00c53a85-ba20-4666-80db-fbbe2492c0ad
        Information 6/11/2011 3:18:08 AM 129 Created Task Process Info
        Information 6/11/2011 3:18:08 AM 200 Action started (1) 00c53a85-ba20-4666-80db-fbbe2492c0ad
        Information 6/11/2011 3:18:08 AM 100 Task Started (1) 00c53a85-ba20-4666-80db-fbbe2492c0ad
        Information 6/11/2011 3:18:08 AM 319 Task Engine received message to start task (1)

    I appreciate any ideas anyone may have.
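    One avenue that may help pin down the 2:32-3:28 gap - offered as a sketch, since only the summary view is quoted above: dump the scheduler's operational channel and the service-control events for that window directly, which can surface engine stalls, service restarts or time jumps that the per-task History tab hides.

        rem newest 50 entries from the Task Scheduler operational log
        wevtutil qe Microsoft-Windows-TaskScheduler/Operational /c:50 /rd:true /f:text
        rem did any service (including the scheduler itself) restart during the gap?
        wevtutil qe System /q:"*[System[Provider[@Name='Service Control Manager']]]" /c:20 /rd:true /f:text

    A system clock change around 2:32 (an aggressive NTP correction, for example) would also produce exactly this pattern of silently skipped triggers, so the System log around that timestamp is worth a look.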

    Read the article

  • Unable to SSH into EC2 instance on Fedora 17

    - by abhishek
    I did the following steps, but I am not able to SSH to it (the same steps work fine on a Fedora 14 image). I am getting: Permission denied (publickey,gssapi-keyex,gssapi-with-mic)
    - I created a new instance using the Fedora 17 Amazon community image (ami-2ea50247).
    - I copied my ssh keys under /home/usertest/.ssh/ after creating a usertest user.
    - I have SELINUX=disabled.
    Here is the debug info:

        $ ssh -vvv ec2-54-243-101-41.compute-1.amazonaws.com
        OpenSSH_5.2p1, OpenSSL 1.0.0b-fips 16 Nov 2010
        debug1: Reading configuration data /etc/ssh/ssh_config
        debug1: Applying options for *
        debug2: ssh_connect: needpriv 0
        debug1: Connecting to ec2-54-243-101-41.compute-1.amazonaws.com [54.243.101.41] port 22.
        debug1: Connection established.
        debug1: identity file /home/usertest/.ssh/identity type -1
        debug1: identity file /home/usertest/.ssh/id_rsa type -1
        debug3: Not a RSA1 key file /home/usertest/.ssh/id_dsa.
        debug2: key_type_from_name: unknown key type '-----BEGIN'
        debug3: key_read: missing keytype
        debug2: key_type_from_name: unknown key type 'Proc-Type:'
        debug3: key_read: missing keytype
        debug2: key_type_from_name: unknown key type 'DEK-Info:'
        debug3: key_read: missing keytype
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug3: key_read: missing whitespace
        debug2: key_type_from_name: unknown key type '-----END'
        debug3: key_read: missing keytype
        debug1: identity file /home/usertest/.ssh/id_dsa type 2
        debug1: Remote protocol version 2.0, remote software version OpenSSH_5.9
        debug1: match: OpenSSH_5.9 pat OpenSSH*
        debug1: Enabling compatibility mode for protocol 2.0
        debug1: Local version string SSH-2.0-OpenSSH_5.2
        debug2: fd 3 setting O_NONBLOCK
        debug1: SSH2_MSG_KEXINIT sent
        debug1: SSH2_MSG_KEXINIT received
        debug2: kex_parse_kexinit: diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
        debug2: kex_parse_kexinit: ssh-rsa,ssh-dss
        debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,[email protected]
        debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,[email protected]
        debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,[email protected],hmac-ripemd160,[email protected],hmac-sha1-96,hmac-md5-96
        debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,[email protected],hmac-ripemd160,[email protected],hmac-sha1-96,hmac-md5-96
        debug2: kex_parse_kexinit: none,[email protected],zlib
        debug2: kex_parse_kexinit: none,[email protected],zlib
        debug2: kex_parse_kexinit:
        debug2: kex_parse_kexinit:
        debug2: kex_parse_kexinit: first_kex_follows 0
        debug2: kex_parse_kexinit: reserved 0
        debug2: kex_parse_kexinit: diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
        debug2: kex_parse_kexinit: ssh-rsa,ssh-dss
        debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,[email protected]
        debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,[email protected]
        debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,[email protected],hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,[email protected],hmac-sha1-96,hmac-md5-96
        debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,[email protected],hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,[email protected],hmac-sha1-96,hmac-md5-96
        debug2: kex_parse_kexinit: none,[email protected]
        debug2: kex_parse_kexinit: none,[email protected]
        debug2: kex_parse_kexinit:
        debug2: kex_parse_kexinit:
        debug2: kex_parse_kexinit: first_kex_follows 0
        debug2: kex_parse_kexinit: reserved 0
        debug2: mac_setup: found hmac-md5
        debug1: kex: server->client aes128-ctr hmac-md5 none
        debug2: mac_setup: found hmac-md5
        debug1: kex: client->server aes128-ctr hmac-md5 none
        debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent
        debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP
        debug2: dh_gen_key: priv key bits set: 131/256
        debug2: bits set: 506/1024
        debug1: SSH2_MSG_KEX_DH_GEX_INIT sent
        debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY
        debug3: check_host_in_hostfile: filename /home/usertest/.ssh/known_hosts
        debug3: check_host_in_hostfile: match line 17
        debug3: check_host_in_hostfile: filename /home/usertest/.ssh/known_hosts
        debug3: check_host_in_hostfile: match line 17
        debug1: Host 'ec2-54-243-101-41.compute-1.amazonaws.com' is known and matches the RSA host key.
        debug1: Found key in /home/usertest/.ssh/known_hosts:17
        debug2: bits set: 500/1024
        debug1: ssh_rsa_verify: signature correct
        debug2: kex_derive_keys
        debug2: set_newkeys: mode 1
        debug1: SSH2_MSG_NEWKEYS sent
        debug1: expecting SSH2_MSG_NEWKEYS
        debug2: set_newkeys: mode 0
        debug1: SSH2_MSG_NEWKEYS received
        debug1: SSH2_MSG_SERVICE_REQUEST sent
        debug2: service_accept: ssh-userauth
        debug1: SSH2_MSG_SERVICE_ACCEPT received
        debug2: key: /home/usertest/.ssh/identity ((nil))
        debug2: key: /home/usertest/.ssh/id_rsa ((nil))
        debug2: key: /home/usertest/.ssh/id_dsa (0x7f904b5ae260)
        debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic
        debug3: start over, passed a different list publickey,gssapi-keyex,gssapi-with-mic
        debug3: preferred gssapi-with-mic,publickey,keyboard-interactive,password
        debug3: authmethod_lookup gssapi-with-mic
        debug3: remaining preferred: publickey,keyboard-interactive,password
        debug3: authmethod_is_enabled gssapi-with-mic
        debug1: Next authentication method: gssapi-with-mic
        debug3: Trying to reverse map address 54.243.101.41.
        debug1: Unspecified GSS failure. Minor code may provide more information
        Credentials cache file '/tmp/krb5cc_500' not found
        debug1: Unspecified GSS failure. Minor code may provide more information
        Credentials cache file '/tmp/krb5cc_500' not found
        debug1: Unspecified GSS failure. Minor code may provide more information
        debug2: we did not send a packet, disable method
        debug3: authmethod_lookup publickey
        debug3: remaining preferred: keyboard-interactive,password
        debug3: authmethod_is_enabled publickey
        debug1: Next authentication method: publickey
        debug1: Trying private key: /home/usertest/.ssh/identity
        debug3: no such identity: /home/usertest/.ssh/identity
        debug1: Trying private key: /home/usertest/.ssh/id_rsa
        debug3: no such identity: /home/usertest/.ssh/id_rsa
        debug1: Offering public key: /home/usertest/.ssh/id_dsa
        debug3: send_pubkey_test
        debug2: we sent a publickey packet, wait for reply
        debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic
        debug2: we did not send a packet, disable method
        debug1: No more authentication methods to try.
        Permission denied (publickey,gssapi-keyex,gssapi-with-mic).
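    Two things are commonly behind exactly this failure pattern on Fedora EC2 images - offered as a hedged checklist, since only the client side is visible in the log above. First, cloud-init installs the EC2 keypair into the image's default account (ec2-user on the Fedora 17 community AMIs, if memory serves), not into accounts created afterwards, so usertest has no authorized key unless you put one there yourself. Second, sshd silently ignores an authorized_keys file whose ownership or permissions are too loose. A minimal sketch (the key filename is hypothetical):

        # try the account the AMI actually provisions the keypair into
        ssh -i ~/.ssh/my-ec2-key.pem ec2-user@ec2-54-243-101-41.compute-1.amazonaws.com
        # then, from that session, make usertest's key material acceptable to sshd
        sudo chown -R usertest:usertest /home/usertest/.ssh
        sudo chmod 700 /home/usertest/.ssh
        sudo chmod 600 /home/usertest/.ssh/authorized_keys

    Note that the client only ever offers /home/usertest/.ssh/id_dsa, so whatever ends up in authorized_keys on the instance must be that key's public half.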

    Read the article

< Previous Page | 247 248 249 250 251 252 253 254 255 256 257 258  | Next Page >