Search Results

Search found 31207 results on 1249 pages for 'atg best practice in industries'.


  • SQL SERVER – Difference between DATABASEPROPERTY and DATABASEPROPERTYEX

    - by pinaldave
    Earlier I asked a simple question on Facebook regarding the difference between DATABASEPROPERTY and DATABASEPROPERTYEX in SQL Server. You can view the original conversation over there. The conversation immediately became very interesting and a lot of healthy discussion happened on the Facebook page. The best part of having the conversation on the Facebook page is the comfort it provides and the leaner commenting interface.

    Question from SQLAuthority.com: What is the difference between DATABASEPROPERTY and DATABASEPROPERTYEX in SQL Server?

    Answer from Rakesh Kumar: DATABASEPROPERTY is supported for backward compatibility but does not provide information about the properties added in this release. Also, many properties supported by DATABASEPROPERTY have been replaced by new properties in DATABASEPROPERTYEX. - source (MSDN).

    Answer from Alphonso Jones: The only real differences I can see are, one, the number of properties covered, and two, that DATABASEPROPERTYEX returns a sql_variant while DATABASEPROPERTY returns only an int.

    Answer from Ambati Venkatasiva: Both are system metadata functions. DATABASEPROPERTYEX returns the current setting of the specified database option. DATABASEPROPERTYEX returns a sql_variant value and DATABASEPROPERTY returns an integer value.

    Answer from Rama Sankar Molleti: Here is the best example of DATABASEPROPERTYEX: SELECT DATABASEPROPERTYEX('dbname', 'Collation') returns SQL_1xCompat_CP850_CI_AS, whereas with DATABASEPROPERTY it returns nothing, as the return type for this property is integer. The sql_variant datatype stores values of various SQL Server supported datatypes except text, ntext, image and timestamp.

    Answer from Alok Seth: SELECT DATABASEPROPERTYEX('AdventureWorks', 'Status') DatabaseStatus_DATABASEPROPERTYEX returns ONLINE, whereas SELECT DATABASEPROPERTY('AdventureWorks', 'Status') DatabaseStatus_DATABASEPROPERTY returns NULL.

    Summary: Use DATABASEPROPERTYEX, as it is the only function that will be supported in future versions, and it returns the status of various database properties that do not exist with DATABASEPROPERTY.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Connecting Small business network to Azure Site to Site VPN

    - by MarkKGreenway
    I would like to have connectivity between Azure virtual machines and on-LAN users. My current network has a Cisco ISA550 connected to the WAN (one Ethernet cable into the office; the fiber transceiver is on a different floor), and any public servers can be one-to-one NATed to have a public and a private IP. What is the best way to get a reliable connection between end users and the cloud? I want to know the preferred on-site endpoint. Do the Azure VMs have to have a local IP in the LAN subnet? (Right now 10.10.0.0/20, or 255.255.240.0, to give room if this is the case.) If I purchased an ASA550, would I put it behind or in front of the ISA550? Would it sit ahead of, or peer with, the users' switches? What is the best way to get a reliable connection between the end users and the cloud servers?

    Read the article

  • Solaris Tech Day with Engineering, Dec 3, Frankfurt

    - by Franz Haberhauer
    On Tuesday, December 3, 2013, we will have the head of Solaris Engineering, Markus Flierl, together with some of his engineers and Joost Pronk from product management, as guests at our office in Dreieich (Frankfurt). We are using this opportunity to give you deep insights into Solaris technologies at a Solaris Tech Day, straight from the source.

    Agenda (Time - Session - Speaker):
    09:00 Registration and Breakfast
    09:45 Oracle Solaris - Strategy, Engineering Insights, Roadmap, and a Glimpse on Solaris in Oracle's IT - Markus Flierl
    11:15 Coffee
    11:35 Oracle Solaris 11.1: The Best Platform for Oracle - The Technologies Behind the Scenes - Bart Smaalders
    12:35 Lunch
    13:25 Solaris Security: Reduce Risk, Deliver Secure Services, and Monitor Compliance - Darren Moffat
    14:10 Solaris 11 Provisioning and SMF - Insights from the Lead Engineers - Bart Smaalders & Liane Praza
    14:55 Solaris Data Management - ZFS, NFS, dNFS, ASM, and OISP Integration with the Oracle DB - Darren Moffat
    15:25 Coffee
    15:45 Solaris 10 Patches and Solaris SRUs - News and Best Practices - Gerry Haskins
    16:30 Cloud Formation: Implementing IaaS in Practice with Oracle Solaris - Joost Pronk
    17:00 Q&A panel - All presenters and Solaris engineers

    Please register here to secure a seat at this exceptional event. It is also worth taking a look at the blog of Markus Flierl, with an interesting post on impressions of and outlooks from Oracle OpenWorld 2013, or the one by Darren Moffat. Gerry Haskins, as Director of Solaris Lifecycle Engineering, writes two blogs at once: the Patch Corner, focused on Solaris 10, and the Solaris 11 Maintenance Lifecycle. Already next week, the DOAG 2013 Conference and Exhibition takes place in Nuremberg, with a broad range of talks around Solaris, in particular including many hands-on experience reports from the field.

    Read the article

  • Satellite TV on my PC?

    - by jasondavis
    What would be a good solution for hooking up the satellite TV box (Dish Network) in my room so that I can watch it on my PC and possibly record video from it? Please share the best cards, cables, software, and anything else needed to do this in the most efficient way. I am looking at the WinTV-HVR from Hauppauge. I am not sure which would be best for performance: PCI, PCI Express, or USB 2.0? Also, on the Hauppauge website I saw this note: "WinTV-PVR products will not work in PC systems with 4GB or more of memory." The new PC I am building will have 12-24 GB of DDR3 RAM; does that mean their products will not work at all with my memory? So confused now!

    Read the article

  • DON'T MISS THE ORACLE LINUX GENERAL SESSION @ORACLE OPENWORLD

    - by Zeynep Koch
    We have had great sessions today at OpenWorld, but tomorrow will be even better. The session that you should not miss is:

    General Session: Oracle Linux Strategy and Roadmap - Tuesday, Oct 2nd, 10:15am, Moscone South #103. Wim Coekaerts, Sr. VP, Oracle Linux and Virtualization Engineering, will talk about the Oracle Linux strategy and what is coming in the next 12 months. This is one session you should not miss, and people are already registering. Stop by to hear Wim and ask questions about Linux development.

    Top Technical Tips for Automatic and Secure Oracle Linux Deployments - 11:45am, Moscone South #270. In this session, you will hear about deployment best practices and tips from Lenz Grimmer from Oracle, and two Linux customers, Martin Breslin from SEI and Ed Bailey from Transunion, will talk about their experiences and insights.

    Why Switch to Oracle Linux? - 3:30pm, Moscone South #270. In this session you will learn why Oracle Linux is best for your enterprise. There will be an Oracle speaker, and Mike Radomski from SUNY will talk about why they chose Oracle Linux.

    Please also visit the Oracle Linux Pavilion. If you stop by one of our partners' booths, you can enter the drawing for a beautiful plush penguin. See you all tomorrow.

    Read the article

  • Post Crosstalk 2012

    - by David Dorf
    This year the Oracle Retail users conference, Crosstalk, had a 20% increase in attendees, which was driven by both new customers and those acquired via Endeca. As the product assets of Oracle have grown, so has the completeness of the solution set. This year was marked by the breadth of omni-channel stories. Rose Spicer and her marketing team (see photo on left) always strive for an equal balance of retailer presentations, networking opportunities, and unique experiences -- this year was no exception. We had 41 different retailers from China, Russia, South Africa, Brazil, Chile, the US, Canada, and the UK sharing their insights with one another. In all there were 251 executives from 120 iconic brands such as Daphne, Kohl's, Morrisons, Abercrombie & Fitch, Hot Topic, Talbots, Petco, Deckers, Sportmaster, Mr. Price, Falabella, and Disney, to name a few.

    From a product perspective, there were a few new developments from Oracle Retail:
    - Endeca's search engine has been integrated into the ATG commerce platform.
    - The latest Retail Analytics application, Oracle Retail Customer Analytics, is generally available.
    - Oracle Retail previewed a new fully-integrated mobile POS.

    But the real benefit of attending Crosstalk was hearing about the experiences of retailers and partners. Here are a few interesting facts I picked up:
    - At Kohl's, the most popular website accessed by customers within their stores is Facebook. With all the buzz about showrooming, I was really expecting it to be Amazon.
    - Daphne, a Chinese shoe retailer, is opening 3 new stores per day. Being located near the factories allows them to have a very agile supply chain as well.
    - Disney Stores have increased sales by 25% at stores upgraded to include Mobile POS. They continue to lead the pack with excellent customer experiences.
    - Quicksilver reported that 1 in 5 visits to their website comes from a tablet. More evidence that tablets are replacing traditional PCs in households.
    - By tagging shoes with RFID, Saks is able to ensure all shoe models are on display. If a model is not being displayed, it has no chance of being sold.

    Additionally, there were awards, store tours on Michigan Avenue, fireworks at Navy Pier, and the Oracle Retail house band, Bolo313, performing at Soldier Field. Speaking of which, a few retailers got on stage and jammed with the band -- a possible rival to Rock & Roll Retail? You can always find the latest info from us at the Retail Rack. The next events on tap are the Partner Summit followed by OpenWorld.

    Read the article

  • My Sessions at Oracle OpenWorld 2012

    - by Ravi Sankaran
    I have 2 sessions at Oracle OpenWorld 2012.

    Oracle Fusion Applications: Customizing and Extending Business Processes (CON8719)
    Rajesh Raheja, Senior Director, Product Management for Oracle Fusion Middleware Business Integration, joins me to talk about the approaches to customizing and extending Oracle Fusion Applications with Oracle SOA Suite.
    When: Monday, Oct 1, 4:45 PM – 5:45 PM
    Where: Palace Hotel – Twin Peaks North

    Oracle Fusion Applications: Best Practices in Integration Design Patterns (CON8685)
    I will join Rajesh Raheja to provide a high-level view of the Oracle Fusion Applications integration strategy and to show the best-practice integration design patterns. You will learn how to discover integration assets, invoke web services and use cloud data integration. The session is not just limited to SaaS deployments, but will be useful for on-premises customers as well.
    When: Tuesday, Oct 2, 1:15 PM – 2:15 PM
    Where: Palace Hotel – Telegraph

    I will also be at the SOA Customer Advisory Board on Thursday. Here is another session that I would strongly recommend. It discusses how Oracle SOA Suite can be used to integrate applications with the ones on the cloud.

    How to Integrate Cloud Applications with Oracle SOA Suite (CON8968)
    Rajesh Raheja will be joined by Geeta Pyne (Director, Middleware at BMC Software) to address cloud integration challenges and how Oracle SOA Suite can help with a consistent approach to integration, whether on-premises or cloud. I am quite excited about this session as we will tackle the hype and myth of "simple" cloud integrations and share real-life application integration experiences. Don't miss this one!
    When: Tuesday, Oct 2, 11:45 AM – 12:45 PM
    Where: Moscone West – 3003

    See you at Oracle OpenWorld!

    Read the article

  • The Value of SOA Specialization - Fujitsu

    - by Jürgen Kress
    Thanks for the nice ink.

    The Value of Specialization
    In my last post I talked about Fujitsu's achievement in obtaining SOA and other specializations, but I have heard murmurings from other partners about just what the value is. I think Oracle has to do more to advertise the benefits to customers; we need to see customers asking for specialization for it to really work, but Oracle has made great promises about only recommending those partners who are specialized. For us there was another benefit. Oracle was sponsoring the 3rd Annual SOA Symposium in Berlin and invited us, as their first specialized partner, to take part. There is a great blog about the symposium on the SOA Community blog site. This is real commitment from Oracle and we have other marketing opportunities being worked on with Jürgen. This does generate leads, so my message to other Oracle partners is: you need to do this, it is worthwhile.

    Fujitsu - First SOA Specialized Partner Globally
    Just before Oracle OpenWorld I found out that Fujitsu had achieved the first SOA Specialization globally. I think most partners know what the requirements are for Specialization, and that in itself is challenging, but the bureaucracy around the actual submission is an exercise in tenacity. I won't go into that now; I have had my dig at Oracle this month, but suffice to say the process could be improved. As a Platinum partner we needed 5 specializations, and we decided to go for SOA first. The reasoning behind this is that our Oracle Practice is known for being applications-centric. We have always had an excellent technical capability, but no one ever talked about that; it was just part and parcel of an implementation. However, today we have just as many bids that are technology-led as there are applications-led, so it seemed a good plan to work on the areas we were not known for. We appointed a capability lead to be responsible for putting the team through the training and testing, and Rosemary (Kell) was excellent; she ensured that everyone was on track and that it wasn't just put into the 'to do' list. In Fujitsu everyone in the Oracle Practice has an objective to achieve the competency tests in their area, so achieving the 2 pre-sales, 2 sales and 1 support was no problem at all. We actually had 22 with the support capability proficiency. The implementation specialist exams are much harder, more like OCP in the database area. We had help from the Oracle SOA Community; Jürgen Kress, who runs this in EMEA, is really motivational. At the time we started, SOA was a beta exam, which means you do not get the results immediately, but again we put forward more than we needed. Manjit Chopra, Sukhraj Sahota, Emely Patra, Ian Scorrer and Sunny Sidhu all took the exam and eventually got the results they wanted: they had passed. Congratulations. Here is Jürgen explaining why specialization is important.

    After the tests came the submission, where you need to include deals and experience; this was my bit, persuading Oracle we really deserved the specialization. Finally we got the news we had been awarded the specialization, and a few days later that we were first globally. I am very proud. However there is no rest for the wicked: we plodded on to make the 5 specializations needed for Platinum, and now we are working on the new Diamond status. I think SOA will be one of our 5 'super specializations'. This is a global Fujitsu initiative and I work closely with my colleague in Germany, Jessika Weiss.
It was nice to be able to have a press release about this and a comment from Judson Althoff, head of Oracle Alliances. For more information on SOA Specialization and the SOA Partner Community please feel free to register at www.oracle.com/goto/emea/soa (OPN account required). Technorati Tags: SOA, SOA Community, OPN, Oracle, Fujitsu, Debra Lilley, Jürgen Kress, Specialization, SOA Specialization

    Read the article

  • Ajax application: using SOAP vs REST?

    - by coder
    I'm building an Ajax-heavy application (client-side strictly HTML/CSS/JS) which will be getting all its data and using server business logic via web services. I know REST seems to be the hot topic, but I can't find any good arguments for it. The main argument seems to be that it is "light-weight". My impression so far is that WSDL/SOAP-based services are more expressive and allow for a more complex transfer of data. It appears that SOAP would be more useful in the application I'm building, where the only code consuming the services will be the JS downloaded in the client browser. REST, on the other hand, seems to have a smaller entry barrier and so can be more useful for services like Twitter in allowing other developers to consume those services easily. Also, REST seems to be better suited for simple data transfers. So in summary: SOAP is useful for complex data transfer and REST is useful for simple data transfer. I'm currently under the impression that using SOAP would be best due to the complexity of the messages, but perhaps there are other factors. What are your thoughts on the pros/cons of SOAP/REST for a heavy Ajax web app?

    EDIT: While the WSDL is in XML, the data I'm transferring back and forth is actually in JSON. It just appears more natural to use WSDL/SOAP here due to the nature of the app. The verbs GET and POST may not be enough; I may want to say something like processQueue or executeTimer. This is why my conclusion has been that WSDL/SOAP would be good for bridging a complex layer between two applications (client and server), whereas REST would be better (due to its simplicity) for allowing many developer-users to consume resources programmatically. So you could say the choice falls along two lines:
    1. Will the app be verb-oriented (completing tasks: use SOAP) or noun-oriented (consuming resources: use REST)?
    2. Will the API be consumed by few developers or many developers (REST is strong for many developers)?
    Since such an Ajax-heavy app would potentially use many verbs and would only be used by the client developer, it appears SOAP/WSDL would be the best fit.
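    To make the verb-versus-noun distinction concrete, here is a minimal TypeScript sketch of how the two call styles would look from my client-side code. The endpoints, operation names, and payload shapes are made up for illustration, not taken from any real service:

    ```typescript
    // REST style: the URL names the resource, the HTTP verb carries the intent.
    async function getQueueRest(queueId: string): Promise<unknown> {
      const response = await fetch(`/api/queues/${queueId}`, {
        method: "GET",
        headers: { Accept: "application/json" },
      });
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      return response.json();
    }

    // SOAP style: every call is a POST of an XML envelope; the operation name
    // ("ProcessQueue" here) lives inside the body rather than in the URL or verb.
    async function processQueueSoap(queueId: string): Promise<string> {
      const envelope = `<?xml version="1.0" encoding="utf-8"?>
        <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
          <soap:Body>
            <ProcessQueue xmlns="http://example.com/queue">
              <QueueId>${queueId}</QueueId>
            </ProcessQueue>
          </soap:Body>
        </soap:Envelope>`;
      const response = await fetch("/services/QueueService.asmx", {
        method: "POST",
        headers: {
          "Content-Type": "text/xml; charset=utf-8",
          SOAPAction: "http://example.com/queue/ProcessQueue",
        },
        body: envelope,
      });
      if (!response.ok) throw new Error(`HTTP ${response.status}`);
      return response.text();
    }
    ```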

    Read the article

  • E-Seminars for Partners - Mar-Apr/10

    - by Claudia Costa
    To register for the training sessions listed below, please use the registration links indicated.

    Name | Date | Duration | Location
    Oracle Real-Time Decisions - Implementation Best Practices | 21.04.2010 | 1 hour/day, start: 15:00 | online
    Oracle WebLogic Suite 11g Overview & Proficiency Series | 15, 26, 29, 30.03.2010 | 1 hour/day, start: 09:00 | online
    Upgrade to Oracle WebLogic Suite 11g | 19.03.2010 | 1 hour, start: 09:00 | online
    Oracle Real-Time Decisions: Introduction to Real-Time Decisions | 9.04.2010 | 1 hour, start: 15:00 | online
    Best Strategies for Migrating from Teradata to Oracle Exadata | 18.03.2010 | 1 hour, start: 15:00 | online
    Oracle Database Awareness - 11gR2 Features for Data Warehouse and OLAP | 19.03.2010 | 1 hour, start: 15:00 | online
    Oracle Universal Content Management (UCM) eSeminar Series | 2, 25.03.2010 | 1 hour/day, start: 09:00 | online
    Oracle Information Rights Management Overview | 17.03.2010 | 1 hour, start: 15:00 | online

    For more information contact Melissa Lopes - Tel: 214235194

    Read the article

  • where to look for computer technician jobs

    - by Kareem
    Hi, I am currently studying for the A+ certification. I plan to have it by the end of this month, and I plan to go for further education. I've built two high-end computers by myself for a friend and a family member, installed the OS and everything. I'm looking into finding either a computer assembly or computer technician job. Where is the best place to look for one? I've looked into Best Buy, but I find their Geek Squad to be a little bit shady. Where is a good place to look for a full-time, entry-level computer technician job, just starting out, in Tampa, FL?

    Read the article

  • I/O-intensive MySQL server on Amazon AWS

    - by rhossi
    We recently moved from a traditional data center to cloud computing on AWS. We are developing a product in partnership with another company, and we need to create a database server for the product we'll release. I have been using Amazon Web Services for the past 3 years, but this is the first time I have received a spec with this very specific hardware configuration. I know there are trade-offs and that real hardware will always be faster than virtual machines, and knowing that fact beforehand, what would you recommend?
    1) Amazon EC2?
    2) Amazon RDS?
    3) Something else?
    4) Forget it baby, stick to the real hardware

    Here are the hardware requirements. Each server will be focused on I/O and MySQL for the statistics, and on memory size and disk space for the image hosting.

    Server 1
    - I/O: The very main part of this server will be I/O processing. FusionIO cards have proven themselves extremely efficient; this is currently the best you can have in this domain. Fusion ioDrive2 MLC 365GB (http://www.fusionio.com/load/-media-/1m66wu/docsLibrary/FIO_ioDrive2_Datasheet.pdf)
    - CPU: MySQL will use fewer CPU cores than Apache but it will use them very hard; the E7 family has 30M of L3 cache, which provides a performance boost. 1x Intel E7-2870 will be ok.
    - Storage: SAS will be good enough in terms of performance, especially considering the space required. RAID 10 of 4 x SAS 10k or 15k for a total available space of 512 GB.
    - Memory: 64 GB minimum is required on this server considering the size of the statistics database. Warning: the statistics database will grow quickly; if possible consider starting with 128 GB directly, it will help.

    Server 2
    - The requirements are identical to Server 1: the same Fusion ioDrive2 MLC 365GB card, 1x Intel E7-2870, RAID 10 of 4 x SAS 10k or 15k for 512 GB of available space, and 64 GB of RAM minimum (ideally 128 GB).

    Thanks in advance. Best,

    Read the article

  • Multidimensional Thinking–24 Hours of Pass: Celebrating Women in Technology

    - by smisner
    It’s Day 1 of #24HOP and it’s been great to participate in this event with so many women from all over the world in one long training-fest. The SQL community has been abuzz on Twitter with running commentary which is fun to watch while listening to the current speaker. If you missed the fun today because you’re busy with all that work you’ve got to do – don’t despair. All sessions are recorded and will be available soon. Keep an eye on the 24 Hours of Pass page for details. And the fun’s not over today. Rather than run 24 hours consecutively, #24HOP is now broken down into 12-hours over two days, so check out the schedule to see if there’s a session that interests you and fits your schedule. I’m pleased to announce that my business colleague Erika Bakse ( Blog | Twitter) will be presenting on Day 2 – her debut presentation for a PASS event. (And I’m also pleased to say she’s my daughter!) Multidimensional Thinking: The Presentation My contribution to this lineup of terrific speakers was Multidimensional Thinking. Here’s the abstract: “Whether you’re developing Analysis Services cubes or creating PowerPivot workbooks, you need to get into a multidimensional frame of mind to produce a model that best enables users to answer their business questions on their own. Many database professionals struggle initially with multidimensional models because the data modeling process is much different than the one they use to produce traditional, third normal form databases. In this session, I’ll introduce you to the terminology of multidimensional modeling and step through the process of translating business requirements into a viable model.” If you watched the presentation and want a copy of the slides, you can download a copy here. And you’re welcome to download the slides even if you didn’t watch the presentation, but they’ll make more sense if you did! Kimball All the Way There’s only so much I can cover in the time allotted, but I hope that I succeeded in my attempt to build a foundation that prepares you for starting out in business intelligence. One of my favorite resources that will get into much more detail about all kinds of scenarios (well beyond the basics!) is The Data Warehouse Toolkit (Second Edition) by Ralph Kimball. Anything from Kimball or the Kimball Group is worth reading. Kimball material might take reading and re-reading a few times before it makes sense. From my own experience, I found that I actually had to just build my first data warehouse using dimensional modeling on faith that I was going the right direction because it just didn’t click with me initially. I’ve had years of practice since then and I can say it does get easier with practice. The most important thing, in my opinion, is that you simply must prototype a lot and solicit user feedback, because ultimately the model needs to make sense to them. They will definitely make sure you get it right! Schema Generation One question came up after the presentation about whether we use SQL Server Management Studio or Business Intelligence Development Studio (BIDS) to build the tables for the dimensional model. My answer? It really doesn’t matter how you create the tables. Use whatever method that you’re comfortable with. But just so happens that it IS possible to set up your design in BIDS as part of an Analysis Services project and to have BIDS generate the relational schema for you. I did a Webcast last year called Building a Data Mart with Integration Services that demonstrated how to do this. 
Yes, the subject was Integration Services, but as part of that presentation, I showed how to leverage Analysis Services to build the tables, and then I showed how to use Integration Services to load those tables. I blogged about this presentation in September 2010 and included downloads of the project that I used. In the blog post, I explained that I missed a step in the demonstration. Oops. Just as an FYI, there were two more Webcasts to finish the story begun with the data – Accelerating Answers with Analysis Services and Delivering Information with Reporting Services. If you want to just cut to the chase and learn how to use Analysis Services to build the tables, you can see the Using the Schema Generation Wizard topic in Books Online.

    Read the article

  • Is my current employer expecting too much?

    - by priyank patel
    This is my first job as a programmer. I am working with ASP.NET/C#, HTML, CSS, and JavaScript/jQuery. I work for a firm which develops software for small banking firms; currently they have their software running in 100 firms. Their software is developed in Visual FoxPro, and I was hired to develop the online version of it. I am the solo developer. My boss is the other developer, so the company has two developers in total, and my boss does not have any idea about .NET development.

    I have been working on their project for 8 months. The progress is surely there, but not very big. I try my best to do what my boss asks, but the project just seems too ambitious for me. The company does not have any planning for the project; they just ask me to develop what their older software provides. So I have to deal with the front end, the back end, code reviews, designing the architecture, and so on. I have decided to give my best and I try a lot, but the project sometimes just seems overwhelming.

    So my question is: is it normal for a programmer to be in this place? I always feel the need to work in at least a small team, if not a big one. Are my employers just expecting too much of a fresher, or is it that I, being a programmer, lack the skills to deal with this? I am just not able to judge my situation. Also, I am paid a very low salary, and I work on Saturdays as well. Can anyone help me judge this scenario? Any suggestions are welcome.

    Read the article

  • Sizing a Virtual Server

    - by vdubs
    I would like to replace four aging physical servers with one virtual server. What is the best way to ensure the VM server is sized correctly? The requirements of the apps that will be running on the four servers are:

    APPLICATION SERVERS - QTY 3 - These will run the application layer for the web server, the Business Objects Business Intelligence app, and various other small client-server apps. The three most heavy-hitting apps each have the following server requirements, so if I bought three physical servers, these would be the requirements for each of them:
    - Processor - Dual 2.83 GHz (or faster)
    - RAM - 4 GB
    - RAID 5 - 50-100 GB usable space
    - NIC - 1 Gb

    WEB SERVER - This will run one ASP.NET e-business app that will talk to our dedicated SQL server and the three app servers above. The e-business software has these requirements for the web server:
    - Processor - Quad 2.83 GHz (or faster)
    - RAM - 8 GB
    - RAID 5 - 50-100 GB usable space
    - NIC - 1 Gb

    What is the best tool to determine what I need from a hardware standpoint in a virtual server? I am planning on using VMware.
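    For reference, one rough sanity check I could do myself is to total the per-workload requirements and add some headroom; the sketch below does just that, and the 20% headroom figure is only my assumption, not a vendor recommendation:

    ```typescript
    // Rough host sizing by totaling per-workload requirements.
    // The workload numbers mirror the specs above; the 20% headroom factor
    // is an assumed allowance for hypervisor overhead and growth.
    type Workload = { name: string; count: number; cores: number; ramGb: number; diskGb: number };

    const workloads: Workload[] = [
      { name: "Application server", count: 3, cores: 2, ramGb: 4, diskGb: 100 },
      { name: "Web server", count: 1, cores: 4, ramGb: 8, diskGb: 100 },
    ];

    const headroom = 1.2;

    const totals = workloads.reduce(
      (sum, w) => ({
        cores: sum.cores + w.count * w.cores,
        ramGb: sum.ramGb + w.count * w.ramGb,
        diskGb: sum.diskGb + w.count * w.diskGb,
      }),
      { cores: 0, ramGb: 0, diskGb: 0 }
    );

    console.log(
      `Host needs roughly ${Math.ceil(totals.cores * headroom)} cores, ` +
        `${Math.ceil(totals.ramGb * headroom)} GB RAM, ` +
        `${Math.ceil(totals.diskGb * headroom)} GB of usable disk.`
    );
    // -> Host needs roughly 12 cores, 24 GB RAM, 480 GB of usable disk.
    ```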

    Read the article

  • Is Joel Test really a good gauging tool?

    - by henry
    I just learned about the Joel Test. I have been a computer programmer for 22 years, but somehow never heard about it before. I consider my best job so far to be a small investment-managing company with 30 employees and only 3 people in the IT department. I am no longer with them, but I had been working there for 5 years - my longest streak with any given company. To my surprise, they scored extremely poorly on the Joel Test. The only two questions I would answer "yes" to are #4: Do you have a bug database? and #9: Do you use the best tools money can buy? Everything else is either "sometimes" or a straight "no".

    Here is what I liked about the company, however:
    a) Good pay. They bragged about it to my face and I bragged about it to their face, so it was almost like a family environment.
    b) I always knew the big picture. When writing code to solve a particular problem there was no ambiguity about the business nature of that problem. Even though we did not always have written specs, we could ask business users a question anytime, often yelling it across the floor. I could even talk to executives any time I felt like doing it: no appointment necessary.
    c) Immediate feedback. Once we implemented a solution and made business users happy, they immediately let us know it; we (programmers) became the heroes of the moment.
    d) No red tape. I could always buy any tools I deemed necessary, and design solutions the way my professional judgment dictated.
    e) Flexibility. If I had a mid-day dental appointment near my house rather than near the office, I would send an email to the company: "FYI: I work from home today". As long as one of the 3 IT guys was on the floor (to help traders in case their monitors went dark), they did not care where the 2 others were.

    So the question thus becomes: how valuable is the Joel Test? Why bother with it?

    Read the article

  • How much detail is in a good UI regression test?

    - by GlenPeterson
    We use a detailed step-by-step user-interface regression test for our commercial web application. It has a "backbone" test for the most used / most important parts of the system, with optional tests for specific areas of functionality. Using this plan has definitely helped us ensure high-quality software. But having very specific tests can be counter-productive. The tester concentrates on following the test and will completely miss usability issues, or not notice fairly obvious problems such as the bottom part of a page that is missing. By contrast, some of the best UI testing happens when building a demo of a new feature. I often do my own best testing by pretending to demonstrate the system to an imaginary prospect. Yet when I tell the testers, "Just demonstrate the system to yourself", they don't cover nearly as much functionality as they do with a detailed point-by-point test. I'm repeatedly asked to provide more and more detail in the test plan so that a new untrained tester can test with it without asking any questions. Yet detail seems to be counter-productive. How much detail do you put in a regression test to make it effective? What techniques make the tester focus more on the system than on checking off items on the test?

    Read the article

  • Managing JS and CSS for a static HTML web application

    - by Josh Kelley
    I'm working on a smallish web application that uses a little bit of static HTML and relies on JavaScript to load the application data as JSON and dynamically create the web page elements from that.

    First question: Is this a fundamentally bad idea? I'm unclear on how many web sites and web applications completely dispense with server-side generation of HTML. (There are obvious disadvantages of JS-only web apps in the areas of graceful degradation / progressive enhancement and being search-engine friendly, but I don't believe that these are an issue for this particular app.)

    Second question: What's the best way to manage the static HTML, JS, and CSS? For my "development build," I'd like non-minified third-party code, multiple JS and CSS files for easier organization, etc. For the "release build," everything should be minified, concatenated together, etc. If I were doing server-side generation of HTML, it'd be easy to have my web framework generate different development versus release HTML that includes multiple verbose versus concatenated minified code. But given that I'm only serving static HTML, what's the best way to manage this? (I realize I could hack something together with ERB or Perl, but I'm wondering if there are any standard solutions.)

    In particular, since I'm not doing any server-side HTML generation, is there an easy, semi-standard way of setting up my static HTML so that it contains code like

    <script src="js/vendors/jquery.js"></script>
    <script src="js/class_a.js"></script>
    <script src="js/class_b.js"></script>
    <script src="js/main.js"></script>

    at development time and

    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>
    <script src="js/entire_app.min.js"></script>

    for release?
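    To illustrate the kind of thing I could hack together myself (rather than a standard solution), here is a small Node/TypeScript sketch that concatenates the dev scripts and swaps the script block for release; the file names and the build-comment markers are placeholders I made up:

    ```typescript
    // Concatenate the development scripts and rewrite the HTML for release.
    // "index.dev.html", the file list, and the <!-- build:js --> markers are
    // placeholders; a real project would likely lean on an existing bundler.
    import { readFileSync, writeFileSync } from "fs";

    const devScripts = ["js/class_a.js", "js/class_b.js", "js/main.js"];

    // 1. Concatenate the app scripts (a real build would also minify here).
    const bundle = devScripts.map((f) => readFileSync(f, "utf8")).join("\n;\n");
    writeFileSync("js/entire_app.min.js", bundle);

    // 2. Swap the block of dev <script> tags for the release tags.
    const releaseTags = [
      '<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.8.2/jquery.min.js"></script>',
      '<script src="js/entire_app.min.js"></script>',
    ].join("\n");

    const devHtml = readFileSync("index.dev.html", "utf8");
    const releaseHtml = devHtml.replace(
      /<!-- build:js -->[\s\S]*?<!-- endbuild -->/,
      releaseTags
    );
    writeFileSync("index.html", releaseHtml);
    ```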

    Read the article

  • SEO for Country & Language Specific content.

    - by kecebongsoft
    Currently I am creating a website which has a common topic for an article, but the content is going to be different for each country, and each piece of that content will also be provided in several languages. This mechanism exists in most parts of the website. For example, I have an article about tax. This article has to be different for each country, for example China, and the tax content for China should be written in Chinese AND in English (for non-Chinese speakers). What is the best URL pattern to handle this?

    What I've been thinking is using a subfolder (/country-code/language-code/) such as:
    www.example.com/cn/cn/tax
    www.example.com/cn/en/tax

    Or using a top-level domain such as:
    www.example.cn/cn/tax
    www.example.cn/en/tax

    Or a subdomain such as:
    cn.example.com/cn/tax
    cn.example.com/en/tax

    I think I will not prefer the last option, since I might need to use subdomains for another purpose, which leaves only the subfolder and the TLD. I've read some articles saying that a country TLD is good for localized (language-specific) content, but in my case my TLD will also have English content (for non-local speakers) which is specific only to that particular country (the purpose of this is to let people from other countries easily search for it through Google). What is the best pattern to pick, and why?
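    For what it's worth, here is a sketch of how I picture the subfolder option working together with hreflang annotations, so each country/language variant points search engines at its siblings; the country/language list and the URL builder are illustrative assumptions:

    ```typescript
    // Country/language variants of one article under the subfolder pattern.
    // The list below and the buildUrl shape are illustrative assumptions.
    type Variant = { country: string; language: string };

    const variants: Variant[] = [
      { country: "cn", language: "zh" }, // China, Chinese
      { country: "cn", language: "en" }, // China, English
      { country: "us", language: "en" }, // US, English
    ];

    function buildUrl(article: string, v: Variant): string {
      return `https://www.example.com/${v.country}/${v.language}/${article}`;
    }

    // Emit the <link rel="alternate" hreflang="..."> tags for one article, so
    // every variant page can reference all of its siblings in its <head>.
    function hreflangTags(article: string): string {
      const tags = variants.map(
        (v) =>
          `<link rel="alternate" hreflang="${v.language}-${v.country.toUpperCase()}" href="${buildUrl(article, v)}" />`
      );
      // x-default tells crawlers which version to show when nothing matches.
      tags.push(
        `<link rel="alternate" hreflang="x-default" href="${buildUrl(article, variants[2])}" />`
      );
      return tags.join("\n");
    }

    console.log(hreflangTags("tax"));
    ```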

    Read the article

  • Writing cross-platform Types, Interfaces and Classes/Methods in C++

    - by user827992
    I'm looking for the best solution for writing cross-platform software, i.e. code that I write and that I have to interface with different libraries and platforms each time.

    What I consider the easiest part, correct me if I'm wrong, is the definition of new types: all I have to do is write an hpp file with a list of typedefs. I can keep the same names for each new type across the different platforms, so my codebase can be shared without any problem. typedefs also help me redefine a better scope for my types in my code. I will probably end up having something like this:

    include
    |-windows
    | |-types.hpp
    |-linux
    | |-types.hpp
    |-mac
      |-types.hpp

    For the interfaces I'm thinking about the same solution used for the types: a series of hpp files. I will probably write all the interfaces only once, since they rely on the types, and all "cross-platform portability" is ensured by the work done on the types.

    include
    |-interfaces.hpp
    |-windows
    | |-types.hpp
    |-linux
    | |-types.hpp
    |-mac
      |-types.hpp

    For classes and methods I do not have a real answer. I would like to avoid 2 things: the explicit use of pointers, and the use of templates. I want to avoid the use of pointers because they can make the code less readable for someone, and I want to avoid templates just because if I write them, I can't separate the interface from the definition. What is the best option to hide the use of the pointers? I would also like some words about macros and how to implement some OS-specific calls and definitions.

    Read the article

  • SCVMM upgrade scenario

    - by pigeon
    I've read some information on TechNet about upgrading from SCVMM 2008 to 2012 but can't quite figure out the best way to approach this. The current setup is that we've got SCVMM 2008 R2 installed, but against best practice it was actually installed on the Hyper-V host machine: since it's a small-scale deployment, it's just a single-server setup with SCVMM sitting on the same host rather than in a VM. From what I've read, an in-place upgrade should be possible, which will incur a restart, but I don't have the luxury of another server to shift the VMs onto whilst doing this, nor do I want to risk anything happening to the Hyper-V role. Ideally I would probably prefer just to get SCVMM 2012 into a VM of its own and remove the 2008 version from the host machine. Has anyone done an upgrade like this, or does anyone have any recommendations about how to approach it?

    Read the article

  • FoxTales: Behind the Scenes at Fox Software by Kerry Nietz

    Flashbacks from the past! It's truly amazing to discover that software development, from freshman to senior level, as well as project management, hasn't changed that much. In this memoir, Kerry Nietz covers the span from his final year at college to his first job at Fox Software to 'an early retirement' at Microsoft. This title also brought his other fictional novels to my attention. Once again, here is the review I published on Amazon:

    Built to last! I have been around in software development for more than a decade now, but honestly I have to admit it is only now that I took the opportunity to read about the history of my used-to-be primary programming language. In fact, I started with Visual FoxPro 6 back in 1999 and went only down to FoxPro for Windows 2.6 during migration projects - long after the stories described in this title. It is really interesting to see how they actually managed to create a great product with such a small team of developers. "Create the best Report Writer in the world, out of only sawdust, bubblegum, and dreams." - That's the best sentence I'm going to quote from this title in the future. An inspiration to achieve the impossible, only by taking small steps. Just begin the journey - one step after the next one. If you fall, stand up and continue to walk. Kerry takes the reader on an amazing trip through almost 4 years working at a small software company in Perrysburg, Ohio, that went from another 'look-alike' of the mighty Ashton-Tate dBase to the leading force in database development, long before Microsoft Access (project name: Cirrus) was even finished. It survived Borland Paradox, and even nowadays Visual FoxPro is still in daily use in thousands of companies worldwide. Actually, I'm glad that I had the chance to foster my programming knowledge with Visual FoxPro. After his excellent work in software development, Kerry went for a second career as a writer. I'm looking forward to reading his other titles soon:

    Read the article
