Search Results

Search found 24226 results on 970 pages for 'team foundation build'.

Page 236 of 970

  • [Update] RedGate SQL Source Control and TFSPreview

    - by andyleonard
    31 Oct 2012 Update: SQL Source Control 3.1 is available! - Andy
    12 Oct 2012 Update: The SQL Source Control 3.1 update is currently unavailable. I will provide additional updates when this version is re-released. - Andy
    I am excited that RedGate's SQL Source Control now supports connectivity to TFSPreview, Microsoft's cloud-based Application Lifecycle Management portal. Buck Woody (Blog | @buckwoody) and I have written about TFSPreview at SQLBlog already: Team Foundation Server (TFS) in the...(read more)

    Read the article

  • Available OPN Specialization Categories & Specializations

    - by [email protected]
    Database
      - Oracle Data Warehousing Specialization Criteria
      - Oracle Database 11g Specialization Criteria
      - Oracle Enterprise Linux Specialization Criteria
      - Oracle Enterprise Manager Specialization Criteria
    Middleware
      - Oracle Service-Oriented Architecture Specialization Criteria
      - Oracle Business Intelligence Foundation Specialization Criteria
    Oracle Enterprise Manager
      - Oracle Enterprise Manager Specialization Criteria
    Oracle Enterprise Linux
      - Oracle Enterprise Linux Specialization Criteria
    All Available Specializations: https://competencycenter.oracle.com/opncc/glp_list.cc

    Read the article

  • Problems installing Moose on Mac (compile Sub::Name prereq)

    - by Leonard
    I'm trying to install Moose (a CPAN module) on my Macbook Pro. It finds a dependency on Sub-Name, and when it tries to install this, gets the following error messages. Any idea as to how I can cure this?
      XMATH/Sub-Name-0.04.tar.gz
      /usr/bin/make -- OK
      Running make test
      PERL_DL_NONLAZY=1 /opt/local/bin/perl "-MExtUtils::Command::MM" "-e" "test_harness(0, 'blib/lib', 'blib/arch')" t/*.t
      t/smoke....Can't load '/private/var/root/.cpan/build/Sub-Name-0.04-ziHbmm/blib/arch/auto/Sub/Name/Name.bundle' for module Sub::Name: dlopen(/private/var/root/.cpan/build/Sub-Name-0.04-ziHbmm/blib/arch/auto/Sub/Name/Name.bundle, 2): no suitable image found. Did find:
      /private/var/root/.cpan/build/Sub-Name-0.04-ziHbmm/blib/arch/auto/Sub/Name/Name.bundle: mach-o, but wrong architecture at /opt/local/lib/perl5/5.8.9/darwin-2level/DynaLoader.pm line 230.
      at t/smoke.t line 6
      Compilation failed in require at t/smoke.t line 6.

    Read the article

  • LLVM: bitcode with llvm-gcc (mingw) for windows

    - by TheShow
    Hi, I'm currently building a small JIT compiler. For the language I need a runtime library for some special math functions. I think the best approach would be to compile the lib to bitcode and link it. The compiler should be integrated into a product, and because of this it must work under Windows (VC10, 64-bit). So is it possible to build the math lib with the mingw llvm-gcc build and link it later with the JITed code? Or are there any problems regarding the portability of bitcode built with llvm-gcc under mingw? If there are problems, what solution would you suggest?

    Read the article

  • Software to create a knowledge base/FAQ system

    - by H1Man
    Our company is looking to build a web-based knowledge base system that can be used by our clients/end users to reduce the amount of support calls. A couple of important notes:
    - This is aimed at our end users, in other words, non-techies, so the UI has to be easy to use
    - Should have excellent (fast, accurate) search
    - Should have the ability to rate and comment on articles
    - This will only be maintained by one or two people, so security isn't a big concern
    Something similar to what Microsoft is doing with their Knowledge Base: http://support.microsoft.com/search/ Does anyone have any recommendations on what software I can use? Thanks, H. Edit: I should have made this clear before, but I don't mean build as in having our developers build a support/KB system from the ground up. I am looking to use an existing software package/solution that can be used to implement a knowledge base/support site.

    Read the article

  • How can I bundle other files when using cx_Freeze?

    - by Mridang Agarwalla
    I'm using Python 2.6 and cx_Freeze 4.1.2 on a Windows system. I've created the setup.py to build my executable and everything works fine. When cx_Freeze runs, it moves everything to the build directory. I have some other files that I would like included in my build directory. How can I do this? Here's my structure:
      src\
        setup.py
        janitor.py
        README.txt
        CHNAGELOG.txt
        helpers\
          uncompress\
            unRAR.exe
            unzip.exe
    Here's my snippet:
      setup(
          name='Janitor',
          version='1.0',
          description='Janitor',
          author='John Doe',
          author_email='[email protected]',
          url='http://www.this-page-intentionally-left-blank.org/',
          data_files=[
              ('helpers\uncompress', ['helpers\uncompress\unzip.exe']),
              ('helpers\uncompress', ['helpers\uncompress\unRAR.exe']),
              ('', ['README.txt'])
          ],
          executables=[
              Executable(
                  'janitor.py',  # initScript
              )
          ]
      )
    I can't seem to get this to work. Do I need a MANIFEST.in file? Thank you.
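
    A possible workaround (a minimal sketch, not taken from the original question): cx_Freeze copies extra files through the build_exe option "include_files" rather than distutils-style data_files. Assuming the directory layout shown above and a cx_Freeze version that supports include_files, the setup script could look roughly like this:

        # Hedged sketch: copy the helper executables and README.txt into the
        # build directory via cx_Freeze's build_exe "include_files" option.
        from cx_Freeze import setup, Executable

        build_exe_options = {
            "include_files": [
                ("helpers/uncompress/unzip.exe", "helpers/uncompress/unzip.exe"),
                ("helpers/uncompress/unRAR.exe", "helpers/uncompress/unRAR.exe"),
                "README.txt",
            ],
        }

        setup(
            name="Janitor",
            version="1.0",
            description="Janitor",
            options={"build_exe": build_exe_options},
            executables=[Executable("janitor.py")],
        )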

    Read the article

  • TFS and Project Integration

    - by Enrique Lima
    Recently there have been more and more requests on how to have TFS 2010 and Project 2010 work together. Most of the requests have been around working with Agile and Scrum projects and templates. There are some guidance documents that have become available, and also labs and virtual machine configurations, to work with the different scenarios:
    - TechNet Virtual Lab: Microsoft Enterprise Project Management - Project and Portfolio Management with Project 2010
    - Announcing Visual Studio Team Foundation Server 2010 and Project Server Integration Feature Pack Beta
    - http://code.msdn.microsoft.com/P2010Scrum

    Read the article

  • Converting projects to use Automatic NuGet restore

    - by terje
    Originally posted on: http://geekswithblogs.net/terje/archive/2014/06/11/converting-projects-to-use-automatic-nuget-restore.aspx

    In version 2.7 of NuGet, automatic NuGet restore was introduced, meaning you no longer need to distort your MSBuild project files with NuGet target information. Visual Studio and TFS 2013 Build have this enabled by default. However, if your project was created before this was introduced, and/or if you have used the "Enable NuGet Package Restore" function afterwards, you now have a series of unwanted things in your projects, and a series of project files that have been modified - and you no longer want or need any of this! You might also run into unwanted issues caused by these modifications. This is an MSBuild modification that was needed only before NuGet 2.7. So: DON'T USE THIS FUNCTION!

    There is an issue (https://nuget.codeplex.com/workitem/4019) on the NuGet project site to get this function removed, renamed, or at least moved farther away from the top level (please help vote it up!). The response seems to be that it WILL BE removed, around version 3.0. This function does nothing you need after the introduction of NuGet 2.7. What is also unfortunate is its naming - it implies that it is needed, it is not, and what is worse, there is no corresponding function to remove what it does!

    So to fix this, use the tool named IFix, which will fix the issue for you - all free of course, and the code is open source. Also report issues there: https://github.com/OsirisTerje/IFix

    IFix information
    DOWNLOAD HERE
    This command line tool installs using an MSI and adds itself to the system path. If you work in a team, you will probably need to use the tool multiple times. Anyone on the team may at any time use the "Enable NuGet Package Restore" function and mess up your project again. The IFix program can be run either in a check mode, where it does not write anything back - it only checks whether you have any issues - or in a fix mode, where it will also perform the necessary fixes for you.

    The IFix program is used like this: IFix <command> [-c/--check] [-f/--fix] [-v/--verbose]

    The command in this case is "nugetrestore". It will do a check from the location where it is being called and run through all subfolders from that location. So "IFix nugetrestore --check" will do the check, and "IFix nugetrestore --fix" will perform the changes, for all files and folders below the current working directory. (Note that --check can be replaced with just -c, and --fix with -f, and so on.)

    BEWARE: When you run the fix option, all solutions to be affected must be closed in Visual Studio! So, if you just want to DO it, then run "IFix nugetrestore --check" to see if you have issues, then "IFix nugetrestore --fix" to fix them.

    How does it work
    IFix nugetrestore checks and optionally fixes four issues that the older enabling of NuGet restore caused. The issues are related to the MSBuild process, and are:
    - Deleting the nuget.targets file.
    - Deleting the nuget.exe that is located under the .nuget folder.
    - Removing all references to nuget.targets in the solution file.
    - Removing all properties and target imports of nuget.targets inside the csproj files.
    IFix fixes these issues in the same sequence. The first step, removing the nuget.targets file, is the most critical one: all instances of the nuget.targets file within the scope of a solution have to be removed, and in addition it has to be done with the solution closed in Visual Studio.
    If Visual Studio finds a nuget.targets file, the csproj files will automatically be messed up again. This means the removal process above might need to be done multiple times, especially when you're working with a team and the solution context menu still has the "Enable NuGet Package Restore" function - someone on the team might inadvertently use it at any time. It can be a good idea to add this check to a checkin policy if you run TFS standard version control, but that will have no effect if you use TFS Git version control, of course. So, better be prepared to run the IFix check from time to time. Or, even better, install IFix on your build servers and add a call to "IFix nugetrestore --check" in the TFS build script.

    How does it look
    As a first example I have run the IFix program from the top of a set of git repositories, so it spans multiple repositories with multiple solutions. The result from the check option is as follows: we see the four red lines, one for each of the four checks we talked about in the previous section. The fact that they are red means we have that particular issue.

    The first section (above the first red text line) is the nuget.targets section. Notice No. 1: it says it has found no paths to copy. What IFix does here is check whether there are any defined paths to other NuGet galleries. If there are, those are copied over to the nuget.config file, which is where they should be in version 2.7 and above. No. 2 says it has found the particular nuget.targets file. No. 3 states it HAS found some other NuGet galleries defined in the targets file, which it would then like to copy to the config file. No. 4 is the section for nuget.exe files, listing those it has found and would like to delete. No. 5 states it has found a reference to nuget.targets in the solution file. This reference comes from the fact that the .nuget folder is a solution folder, and the items within are described in the solution file.

    It then checks the csproj files, and as can be seen from the last red line, it has found issues in 96 out of 198 csproj files. There are two possible issues in a csproj file. No. 6 is the first one, the most common and most important one: an "Import Project" section. This is the section that calls the nuget.targets file. No. 7 is another issue, which sometimes is there and sometimes not: a RestorePackages property, which also should go away.

    Now, if we run the IFix nugetrestore --fix command, and then the check again after that, the result is: All green!
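
    As an illustration only (this is not the IFix source, just a hedged sketch of the kind of csproj scan described above; the file patterns and regular expressions are assumptions), a small script can report project files that still import nuget.targets or set the RestorePackages property:

        # Illustrative check, NOT the IFix tool: walk a directory tree and flag
        # csproj files that still reference NuGet.targets or RestorePackages.
        import os
        import re

        IMPORT_RE = re.compile(r'<Import[^>]*NuGet\.targets', re.IGNORECASE)
        RESTORE_RE = re.compile(r'<RestorePackages>\s*true\s*</RestorePackages>', re.IGNORECASE)

        def check(root="."):
            hits = []
            for dirpath, _dirs, filenames in os.walk(root):
                for name in filenames:
                    if name.endswith(".csproj"):
                        path = os.path.join(dirpath, name)
                        with open(path, "r", errors="ignore") as f:
                            text = f.read()
                        if IMPORT_RE.search(text) or RESTORE_RE.search(text):
                            hits.append(path)
            return hits

        if __name__ == "__main__":
            for path in check():
                print("needs cleanup: " + path)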

    Read the article

  • Best practices for large solutions in Visual Studio (2008)

    - by Eyvind
    We have a solution with around 100+ projects, most of them C#. Naturally, it takes a long time to both open and build, so I am looking for best practices for such beasts. Along the lines of questions I am hoping to get answers to are:
    - how do you best handle references between projects
    - should "copy local" be on or off?
    - should every project build to its own folder, or should they all build to the same output folder (they are all part of the same application)?
    - are solution folders a good way of organizing stuff?
    I know that splitting the solution up into multiple smaller solutions is an option, but that comes with its own set of refactoring and building headaches, so perhaps we can save that for a separate thread :-)

    Read the article

  • Building a maven child project that depends on another projects child project with Bamboo

    - by kosoant
    I have two Maven projects:
    Project AAA
      * AAA-Core
      * AAA-Other
    Project BBB
      * BBB-Core
      * BBB-AAA-specific
    I want to create a build plan in Bamboo to build the BBB-AAA-specific project. The plan configuration is such that this project depends on the AAA-Other project build, so everything should work OK. But when I try to run the BBB-AAA-specific Bamboo plan, I get an error that states: "Unable to find resource 'foo.bar.AAA:AAA:pom:0.0.1-SNAPSHOT' in repository snapshots (http://foo.bar.com)" What is going on? The Bamboo builds for "AAA-Core" and "AAA-Other" work as expected.

    Read the article

  • BenkoTips Live and On Demand: Visual Studio 2010, Silverlight 4, and WCF (Level 200)

    In this webcast, we explore what's new and possible with Windows Communication Foundation (WCF) RIA Services and your Microsoft Silverlight application. We show how you can create an entity model and then expose it to your client application and how to build a compelling interface using the data-binding features built into Microsoft Visual Studio 2010.

    Read the article

  • Porting .NET 3.5 app to Linux

    - by gregarobinson
    Started doing research on porting a .NET 3.5 Windows Service application to Linux last week and today. There is a WinForms maintenance/admin tool that needs to be ported too. Looking at Mono and related friends. Need to come up with a way to support MSMQ and Windows Workflow Foundation. Looks like the support for this in Mono is in alpha and not stable yet.

    Read the article

  • MacPorts on Snow Leopard: Python install seems to succeed but doesn't install a non-system Python

    - by thebossman
    I've installed Python via MacPorts. According to this question, the files in /opt/local/bin should run the "correct" Python version. However, all those files are symlinks to:
      /opt/local/Library/Frameworks/Python.framework/Versions/2.6/bin/
    Running them directly from that folder (using no symlinks) runs an Apple build of Python!
      Python 2.6.6 (r266:84292, Jan 6 2011, 13:25:25)
      [GCC 4.2.1 (Apple Inc. build 5664)] on darwin
      Type "help", "copyright", "credits" or "license" for more information.
      >>>
    For comparison, running /usr/bin/python shows a slightly different version:
      Python 2.6.1 (r261:67515, Jun 24 2010, 21:47:49)
      [GCC 4.2.1 (Apple Inc. build 5646)] on darwin
      Type "help", "copyright", "credits" or "license" for more information.
      >>>
    They're both Apple builds! How do I install the correct version?
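
    One quick way to see which interpreter a given command actually resolves to (a small diagnostic sketch, not part of the original question) is to ask the interpreter itself; a MacPorts build normally reports a prefix under /opt/local, while the Apple build reports a system path:

        # Print which Python binary is running and how it was built.
        import sys
        import platform

        print("executable: " + sys.executable)               # path of the running interpreter
        print("version: " + sys.version.replace("\n", " "))  # includes the compiler/build string
        print("prefix: " + sys.prefix)                       # MacPorts installs live under /opt/local
        print("machine: " + platform.machine())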

    Read the article

  • Visual Studio msbuild

    - by user62958
    I have a question regarding the command-line options of msbuild. I am currently using msbuild to build projects using the existing solution files. These solution files have references to external DLLs which have different paths on each machine. I am currently writing a build script and passing the specific path to the project file via the /p: switch of msbuild. My current build line is:
      msbuild test.sln /p:ReferencePath="c:\abc" /p:ReferencePath="c:\rca"
    What I have noticed is that ReferencePath now contains only c:\rca and not c:\abc. This is causing problems for me since the external DLLs lie in two different directories. I am allowed to keep multiple reference paths via Visual Studio, but not via the command line. Is there any known way by which I can do this?
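
    One approach worth trying (a hedged suggestion, not confirmed in the excerpt above, so verify it against your MSBuild version): pass both directories in a single semicolon-separated ReferencePath value, since a repeated /p: switch keeps only the last value - which matches the behavior described, with only c:\rca surviving. Driven from a Python build script, it might look like this:

        # Hypothetical build-script fragment: one ReferencePath property
        # carrying both reference directories, instead of two /p: switches.
        import subprocess

        cmd = [
            "msbuild",
            "test.sln",
            r"/p:ReferencePath=c:\abc;c:\rca",  # assumed semicolon-separated list
        ]
        subprocess.check_call(cmd)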

    Read the article

  • How to configure encoding in maven

    - by Ethan Leroy
    When I run maven install on my multi-module Maven project I always get the following output:
      [WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
    So, I googled around a bit, but all I can find is that I have to add
      <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      </properties>
    to my pom.xml. But it's already there (in the parent pom.xml). Configuring <encoding> for the maven-resources-plugin or the maven-compiler-plugin also doesn't fix it. So what's the problem?

    Read the article

  • Visual Studio 2012 not building dependent projects

    - by user1438940
    I just upgraded a VS2010 project to VS2012 and am now having a problem where dependent projects are not building on demand. For instance, say I have the following projects in my solution:
    - Library A
    - ConsoleApp 1
    where ConsoleApp 1 references Library A. If I change the signature of a method in a class in Library A and run ConsoleApp 1, there will be a compiler error because ConsoleApp 1 does not see my changes: running ConsoleApp 1 did NOT cause Library A to build. If I manually build Library A, then manually build ConsoleApp 1, it works fine. However, I would expect that running ConsoleApp 1 should cause any dependent projects to be rebuilt before launching. Could I have something configured incorrectly? Or is this a bug in VS2012?

    Read the article

  • Building a Java EE app on Mac OS X Snow Leopard for Glassfish 3

    - by Simon
    I'm having a bit of a problem building a Java Enterprise Edition web application on Mac OS X 10.6.2 using Ant 1.7.1, Glassfish v3 and Java EE 6. The problem is that the build process does not find the Java EE libraries, which is fair enough as I don't think Apple supplies them with the default Java installation, but I know they exist in the Glassfish distribution. Which jars are the correct ones to build against (I'm assuming javaee.jar is a general jar which references all the other needed jars), and what should I be putting in my ant build.xml file? Any help is very much appreciated.

    Read the article

  • RedGate SQL Source Control and TFSPreview

    - by andyleonard
    I am excited that RedGate's SQL Source Control now supports connectivity to TFSPreview, Microsoft's cloud-based Application Lifecycle Management portal. Buck Woody (Blog | @buckwoody) and I have written about TFSPreview at SQLBlog already:
    - Team Foundation Server (TFS) in the Cloud - My Experience So Far (Buck)
    - Introducing TFSPreview: Application Lifecycle Management in the Cloud!
    - Using TFSPreview: Step 1, Connecting
    Microsoft's commitment to cloudtech is strong and producing very cool...(read more)

    Read the article

  • Some notes from the Collaboration Summit

    LWN.net: "Your editor has just returned from the Linux Foundation's annual Collaboration Summit, held in San Francisco. LFCS is a unique event; despite becoming more developer-heavy over the years, it still pulls together an interesting combination of people from the wider Linux ecosystem."

    Read the article

  • Cisco PrecisionHD USB Camera (Detected but no video)

    - by Marcel Bissonnette
    I'm having an issue with my Cisco PrecisionHD USB Camera. It is detected but does not show video (using "Cheese Webcam Booth"). Any ideas? (I'm a newbie with Ubuntu.)
    Specs:
    - Ubuntu 12.10 32-bit
    - Dell Latitude E6400
    - Cisco PrecisionHD USB Camera (USB 2.0) connected directly into the laptop (no docking station)
    Troubleshooting: Using the command lsusb, I find the following device:
      Bus 002 Device 003: ID 1f82:0001 TANDBERG PrecisionHD Camera
    plus more devices (audio, finger swipe, Linux Foundation 2.0, 1.1). So what now? Thanks. MB

    Read the article

  • Documenting Business Processes and Capturing Organizational Knowledge with Oracle Tutor 12.2

    Organizations can master the challenges of documenting business processes and capturing organizational knowledge with Oracle Tutor. They can also solve the documentation challenges they face during an implementation/upgrade and satisfy business process regulatory compliance initiatives. Oracle Tutor can help project teams lay the foundation for a successful application rollout or compliance audit by quickly and consistently creating and sustaining employee process documentation throughout the business lifecycle.

    Read the article

  • Why not use development provisioning instead of ad hoc?

    - by lawrence
    I was under the impression that when you use a development provisioning profile for a build of an app, only the specified developers can deploy that build to a phone. But I just deployed a build that uses a development profile to a phone using Xcode Organizer, even though I'm not one of the valid developers for that profile. One of my colleagues, also not a valid developer, did the same with his phone using iTunes. In that case, why not use a development provisioning profile for distributing your app to e.g. your QA team, instead of ad hoc distribution?

    Read the article

  • Oracle Tutor: Top 10 to Implement Sustainable Policies and Procedures

    - by emily.chorba(at)oracle.com
    Overview
    Your organization (executives, managers, and employees) understands the value of having written business process documents (process maps, procedures, instructions, reference documents, and form abstracts). Policies and procedures should be documented because they help to reduce the range of individual decisions and encourage management by exception: the manager only needs to give special attention to unusual problems not covered by a specific policy or procedure. As more and more procedures are written to cover recurring situations, managers will begin to make decisions which will be consistent from one functional area to the next.
    Companies should take a project management approach when implementing an environment for a sustainable documentation program and do the following:
    1. Identify an Executive Champion
    2. Put together a winning team
    3. Assign ownership
    4. Centralize publishing
    5. Establish the Document Maintenance Process Up Front
    6. Document critical activities only
    7. Document actual practice
    8. Minimize documentation
    9. Support continuous improvement
    10. Keep it simple

    1. Identify an Executive Champion
    Appoint a top-down driver. Select one key individual to be a mentor for the procedure planning team. The individual should be a senior manager, such as your company president, CIO, CFO, or the vice-president of quality, manufacturing, or engineering. Written policies and procedures can be important supportive aids when known to express the thinking of the chief executive officer and/or the president and to have his or her full support.

    2. Put Together a Winning Team
    Choose a strong Project Management Leader and staff the procedure planning team with management members from cross-functional groups. Make sure team members have the responsibility - and the authority - to make things happen. The winning team should consist of the Documentation Project Manager, Document Owners (one for each functional area), a Document Controller, and Document Specialists (as needed). The Tutor Implementation Guide has complete job descriptions for these roles.

    3. Assign Ownership
    It is virtually impossible to keep process documentation simple and meaningful if employees who are far removed from the activity itself create it. It is impossible to keep documentation up to date when responsibility for the document is not clearly understood. Key to the Tutor methodology, therefore, is the concept of ownership. Each document has a single owner, who is responsible for ensuring that the document is necessary and that it reflects actual practice. The owner must be a person who is knowledgeable about the activity and who has the authority to build consensus among the persons who participate in the activity, as well as the authority to define or change the way an activity is performed. The owner must be an advocate of the performers and negotiate, not dictate, practices. In the Tutor environment, a document's owner is the only person with the authority to approve an update to that document.

    4. Centralize Publishing
    Although it is tempting (especially in a networked environment and with document management software solutions) to decentralize the control of all documents -- with each owner updating and distributing his own -- Tutor promotes centralized publishing by assigning the Document Administrator (gatekeeper) to manage the updates and distribution of the procedures library.

    5. Establish a Document Maintenance Process Up Front (and stick to it)
    Everyone in your organization should know they are invited to suggest changes to procedures and should understand exactly what steps to take to do so. Tutor provides a set of procedures to help your company set up a healthy document control system. There are many document management products available to automate some of the document change and maintenance steps. Depending on the size of your organization, a simple document management system can reduce the effort it takes to track and distribute document changes and updates. Whether your company decides to store the written policies and procedures on a file server or in a database, the essential tasks for maintaining documents are the same, though some tasks are automated.

    6. Document Critical Activities Only
    The best way to keep your documentation simple is to reduce the number of process documents to a bare minimum and to include in those documents only as much detail as is absolutely necessary. The first step to reducing process documentation is to document only those activities that are deemed critical. Not all activities require documentation. In fact, some critical activities cannot and should not be standardized. Others may be sufficiently documented with an instruction or a checklist and may not require a procedure. A document should only be created when it enhances the performance of the employee performing the activity. If it does not help the employee, then there is no reason to maintain the document. Activities that represent little risk (such as project status), activities that cannot be defined in terms of specific tasks (such as product research), and activities that can be performed in a variety of ways (such as advertising) often do not require documentation. Sometimes, an activity will evolve to the point where documentation is necessary. For example, an activity performed by a single employee may be straightforward and uncomplicated -- that is, until the activity is performed by multiple employees. Sometimes it is the interaction between co-workers that necessitates documentation; sometimes it is the complexity or the diversity of the activity.

    7. Document Actual Practices
    The only reason to maintain process documentation is to enhance the performance of the employee performing the activity. And documentation can only enhance performance if it reflects reality -- that is, current best practice. Documentation that reflects an unattainable ideal or outdated practices will end up on the shelf, unused and forgotten. Documenting actual practice means (1) auditing the activity to understand how the work is really performed, (2) identifying best practices with employees who are involved in the activity, (3) building consensus so that everyone agrees on a common method, and (4) recording that consensus.

    8. Minimize Documentation
    One way to keep it simple is to document at the highest level possible. That is, include in your documents only as much detail as is absolutely necessary. When writing a document, you should ask yourself: What is the purpose of this document? That is, what problem will it solve? By focusing on this question, you can target the critical information.
    - What questions are the end users likely to have?
    - What level of detail is required?
    - Is any of this information extraneous to the document's purpose?
    Short, concise documents are user friendly and they are easier to keep up to date.

    9. Support Continuous Improvement
    Employees who perform an activity are often in the best position to identify improvements to the process. In other words, continuous improvement is a natural byproduct of the work itself -- but only if the improvements are communicated to all employees who are involved in the process, and only if there is consensus among those employees. Traditionally, process documentation has been used to dictate performance, to limit employees' actions. In the Tutor environment, process documents are used to communicate improvements identified by employees. How does this work? The Tutor methodology requires a process document to reflect actual practice, so the owner of a document must routinely audit its content -- does the document match what the employees are doing? If it doesn't, the owner has the responsibility to evaluate the process, to build consensus among the employees, to identify "best practices," and to communicate these improvements via a document update. Continuous improvement can also be an outgrowth of corrective action -- but only if the solutions to problems are communicated effectively. The goal should be to solve a problem once and only once, which means not only identifying the solution, but ensuring that the solution becomes part of the process. The Tutor system provides the method through which improvements and solutions are documented and communicated to all affected employees in a cost-effective, timely manner; it ensures that improvements are not lost or confined to a single employee.

    10. Keep It Simple
    Process documents don't have to be complex and unfriendly. In fact, the simpler the format and organization, the more likely the documents will be used. And the simpler the method of maintenance, the more likely the documents will be kept up to date. Keep it simple by:
    - Minimizing the skills and training required
    - Following the established Tutor document format and layout
    - Avoiding technology just for technology's sake
    No other rule has as major an impact on the success of your internal documentation as: keep it simple.

    Learn More
    For more information about Tutor, visit Oracle.com or the Tutor Blog. Post your questions at the Tutor Forum.

    Emily Chorba
    Principal Product Manager, Oracle Tutor & BPM

    Read the article

  • objective-c Derived class may not respond to base class method

    - by zadam
    Hi, I have derived from a 3rd party class, and when I attempt to call a method in the base class, I get the "x may not respond to y" compiler warning. How can I remove the warning? Repro:
      @interface ThirdPartyBaseClass : NSObject {}
      +(id)build;
      -(void)doStuff;
      @end

      @implementation ThirdPartyBaseClass
      +(id)build {
          return [[[self alloc] init] autorelease];
      }
      -(void)doStuff {
      }
      @end

      @interface MyDerivedClass : ThirdPartyBaseClass {}
      +(id)buildMySelf;
      @end

      @implementation MyDerivedClass
      +(id)buildMySelf {
          self = [ThirdPartyBaseClass build];
          [self doStuff]; // compiler warning here - 'MyDerivedClass' may not respond to '+doStuff'
          return self;
      }
      @end
    Thanks!

    Read the article

  • Top Ten Things to Do Before Hiring a Web Developer

    Before approaching web developers for estimates on building your new business's site, there are a few things you should think through first so you are fully prepared for the questions you will be asked. Here's a list of ten things to be clear on before making that important next step: Be clear on your business plan. This may sound obvious, but it has happened that I've been asked to build a website when the potential client only had an idea of what they wanted and no business foundation planned out at all.

    Read the article
