Merge remote-tracking branch 'origin/v0.10'
author Trevor Norris <trev.norris@gmail.com>
Wed, 15 Jan 2014 21:49:55 +0000 (13:49 -0800)
committer Trevor Norris <trev.norris@gmail.com>
Wed, 15 Jan 2014 21:49:55 +0000 (13:49 -0800)
Conflicts:
lib/domain.js

12 files changed:
doc/blog/Uncategorized/tj-fontaine-new-node-lead.md [new file with mode: 0644]
lib/domain.js
test/simple/test-domain-safe-exit.js [new file with mode: 0644]
tools/gyp/AUTHORS
tools/gyp/pylib/gyp/MSVSVersion.py
tools/gyp/pylib/gyp/generator/make.py
tools/gyp/pylib/gyp/generator/msvs.py
tools/gyp/pylib/gyp/generator/ninja.py
tools/gyp/pylib/gyp/mac_tool.py
tools/gyp/pylib/gyp/msvs_emulation.py
tools/gyp/pylib/gyp/ordered_dict.py
tools/gyp/pylib/gyp/win_tool.py

diff --git a/doc/blog/Uncategorized/tj-fontaine-new-node-lead.md b/doc/blog/Uncategorized/tj-fontaine-new-node-lead.md
new file mode 100644
index 0000000..88c80be
--- /dev/null
@@ -0,0 +1,54 @@
+title: The Next Phase of Node.js
+date: Wed Jan 15 09:00:00 PST 2014
+author: Isaac Z. Schlueter
+slug: the-next-phase-of-node-js
+
+Node's growth has continued and accelerated immensely over the last
+few years.  More people are developing and sharing more code with Node
+and npm than I would have ever imagined.  Countless companies are
+using Node, and npm along with it.
+
+Over the last year, [TJ Fontaine](https://twitter.com/tjfontaine) has become absolutely essential to the
+Node.js project.  He's been building releases, managing the test bots,
+[fixing nasty
+bugs](http://www.joyent.com/blog/walmart-node-js-memory-leak) and
+making decisions for the project with constant focus on the needs of
+our users.  He was responsible for an update to MDB to [support
+running ::findjsobjects on Linux core
+dumps](http://www.slideshare.net/bcantrill/node-summit2013), and is
+working on a shim layer that will provide a stable C interface for
+Node binary addons.  In partnership with Joyent and The Node Firm,
+he's helped to create a path forward for scalable issue triaging.
+He's become the primary point of contact keeping us all driving the
+project forward together.
+
+Anyone who's been close to the core project knows that he's been
+effectively leading the project for a while now, so we're making it
+official.  Effective immediately, TJ Fontaine is the Node.js project
+lead.  I will remain a Node core committer, and expect to continue to
+contribute to the project in that role.  My primary focus, however,
+will be npm.
+
+At this point, npm needs work, and I am eager to deliver what the Node
+community needs from its package manager.  I am starting a company,
+npm, Inc., to deliver new products and services related to npm.  I'll
+be sharing many more details soon about exactly how this is going to
+work, and what we'll be offering.  For now, suffice it to say that
+everything currently free will remain free, and everything currently
+flaky will get less flaky.  Pursuing new revenue is how we can keep
+providing the npm registry service in a long-term sustainable way, and
+it has to be done very carefully so that we don't damage what we've
+all built together.
+
+npm is what I'm most passionate about, and I am now in a position to
+give it my full attention.  I've done more than I could have hoped to
+accomplish in running Node core, and it's well past time to hand the
+control of the project off to its next gatekeeper.
+
+TJ is exactly the leader who can help us take Node.js to 1.0 and
+beyond.  He brings professionalism, rigor, and a continued focus on
+inclusive community values and culture.  In the coming days, TJ will
+spell out his plans in greater detail.  I look forward to the places
+that Node will go with his guidance.
+
+Please join me in welcoming him to this new role :)
diff --git a/lib/domain.js b/lib/domain.js
index ae9c501..bbedcba 100644
@@ -136,14 +136,13 @@ Domain.prototype.enter = function() {
 
 
 Domain.prototype.exit = function() {
-  if (this._disposed) return;
+  // skip disposed domains, as usual, but also don't do anything if this
+  // domain is not on the stack.
+  var index = stack.lastIndexOf(this);
+  if (this._disposed || index === -1) return;
 
   // exit all domains until this one.
-  var index = stack.lastIndexOf(this);
-  if (index !== -1)
-    stack.splice(index);
-  else
-    stack.length = 0;
+  stack.splice(index);
   _domain_flag[0] = stack.length;
 
   exports.active = stack[stack.length - 1];
diff --git a/test/simple/test-domain-safe-exit.js b/test/simple/test-domain-safe-exit.js
new file mode 100644
index 0000000..a7dcef0
--- /dev/null
@@ -0,0 +1,36 @@
+// Copyright Joyent, Inc. and other Node contributors.
+//
+// Permission is hereby granted, free of charge, to any person obtaining a
+// copy of this software and associated documentation files (the
+// "Software"), to deal in the Software without restriction, including
+// without limitation the rights to use, copy, modify, merge, publish,
+// distribute, sublicense, and/or sell copies of the Software, and to permit
+// persons to whom the Software is furnished to do so, subject to the
+// following conditions:
+//
+// The above copyright notice and this permission notice shall be included
+// in all copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
+// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
+// USE OR OTHER DEALINGS IN THE SOFTWARE.
+
+
+// Make sure the domain stack doesn't get clobbered by un-matched .exit()
+
+var assert = require('assert');
+var domain = require('domain');
+
+var a = domain.create();
+var b = domain.create();
+
+a.enter(); // push
+b.enter(); // push
+assert.deepEqual(domain._stack, [a, b], 'b not pushed');
+
+domain.create().exit(); // no-op
+assert.deepEqual(domain._stack, [a, b], 'stack mangled!');
diff --git a/tools/gyp/AUTHORS b/tools/gyp/AUTHORS
index 234e148..9389ca0 100644
@@ -7,4 +7,5 @@ Yandex LLC
 
 Steven Knight <knight@baldmt.com>
 Ryan Norton <rnorton10@gmail.com>
+David J. Sankel <david@sankelsoftware.com>
 Eric N. Vander Weele <ericvw@gmail.com>
diff --git a/tools/gyp/pylib/gyp/MSVSVersion.py b/tools/gyp/pylib/gyp/MSVSVersion.py
index bb30a7b..03b6d8a 100644
@@ -96,9 +96,11 @@ class VisualStudioVersion(object):
       else:
         assert target_arch == 'x64'
         arg = 'x86_amd64'
-        if (os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or
+        # Use the 64-on-64 compiler if we're not using an express
+        # edition and we're running on a 64bit OS.
+        if self.short_name[-1] != 'e' and (
+            os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or
             os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64'):
-          # Use the 64-on-64 compiler if we can.
           arg = 'amd64'
         return [os.path.normpath(
             os.path.join(self.path, 'VC/vcvarsall.bat')), arg]
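
For context on the guard above: gyp's short version names mark Express editions with a trailing 'e' (for example '2010e'), which is what the self.short_name[-1] != 'e' test checks. A minimal standalone sketch of the same decision, with the function name and example values invented for illustration:

    import os

    def vcvarsall_arch_for_x64(short_name):
        # Mirrors the hunk above: pick the 64-on-64 toolchain only when this
        # is not an Express edition and the host OS itself is 64-bit.
        arg = 'x86_amd64'
        host_is_64bit = (os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or
                         os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64')
        if short_name[-1] != 'e' and host_is_64bit:
            arg = 'amd64'
        return arg

    # e.g. vcvarsall_arch_for_x64('2012')  -> 'amd64' on a 64-bit host
    #      vcvarsall_arch_for_x64('2012e') -> 'x86_amd64' (Express edition)
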
diff --git a/tools/gyp/pylib/gyp/generator/make.py b/tools/gyp/pylib/gyp/generator/make.py
index 8fb856f..b88a433 100644
@@ -1951,7 +1951,8 @@ def GenerateOutput(target_list, target_dicts, data, params):
     # We write the file in the base_path directory.
     output_file = os.path.join(options.depth, base_path, base_name)
     if options.generator_output:
-      output_file = os.path.join(options.generator_output, output_file)
+      output_file = os.path.join(
+          options.depth, options.generator_output, base_path, base_name)
     base_path = gyp.common.RelativePath(os.path.dirname(build_file),
                                         options.toplevel_dir)
     return base_path, output_file
@@ -1974,7 +1975,8 @@ def GenerateOutput(target_list, target_dicts, data, params):
   makefile_path = os.path.join(options.toplevel_dir, makefile_name)
   if options.generator_output:
     global srcdir_prefix
-    makefile_path = os.path.join(options.generator_output, makefile_path)
+    makefile_path = os.path.join(
+        options.toplevel_dir, options.generator_output, makefile_name)
     srcdir = gyp.common.RelativePath(srcdir, options.generator_output)
     srcdir_prefix = '$(srcdir)/'
 
@@ -2094,7 +2096,8 @@ def GenerateOutput(target_list, target_dicts, data, params):
 
     this_make_global_settings = data[build_file].get('make_global_settings', [])
     assert make_global_settings_array == this_make_global_settings, (
-        "make_global_settings needs to be the same for all targets.")
+        "make_global_settings needs to be the same for all targets. %s vs. %s" %
+        (this_make_global_settings, make_global_settings))
 
     build_files.add(gyp.common.RelativePath(build_file, options.toplevel_dir))
     included_files = data[build_file]['included_files']
diff --git a/tools/gyp/pylib/gyp/generator/msvs.py b/tools/gyp/pylib/gyp/generator/msvs.py
index 4ca5716..d8e0872 100644
@@ -209,13 +209,14 @@ def _FixPaths(paths):
 
 
 def _ConvertSourcesToFilterHierarchy(sources, prefix=None, excluded=None,
-                                     list_excluded=True):
+                                     list_excluded=True, msvs_version=None):
   """Converts a list split source file paths into a vcproj folder hierarchy.
 
   Arguments:
     sources: A list of source file paths split.
     prefix: A list of source file path layers meant to apply to each of sources.
     excluded: A set of excluded files.
+    msvs_version: A MSVSVersion object.
 
   Returns:
     A hierarchy of filenames and MSVSProject.Filter objects that matches the
@@ -230,6 +231,7 @@ def _ConvertSourcesToFilterHierarchy(sources, prefix=None, excluded=None,
   if not prefix: prefix = []
   result = []
   excluded_result = []
+  folders = OrderedDict()
   # Gather files into the final result, excluded, or folders.
   for s in sources:
     if len(s) == 1:
@@ -238,10 +240,17 @@ def _ConvertSourcesToFilterHierarchy(sources, prefix=None, excluded=None,
         excluded_result.append(filename)
       else:
         result.append(filename)
+    elif msvs_version and not msvs_version.UsesVcxproj():
+      # For MSVS 2008 and earlier, we need to process all files before walking
+      # the sub folders.
+      if not folders.get(s[0]):
+        folders[s[0]] = []
+      folders[s[0]].append(s[1:])
     else:
       contents = _ConvertSourcesToFilterHierarchy([s[1:]], prefix + [s[0]],
                                                   excluded=excluded,
-                                                  list_excluded=list_excluded)
+                                                  list_excluded=list_excluded,
+                                                  msvs_version=msvs_version)
       contents = MSVSProject.Filter(s[0], contents=contents)
       result.append(contents)
   # Add a folder for excluded files.
@@ -249,6 +258,17 @@ def _ConvertSourcesToFilterHierarchy(sources, prefix=None, excluded=None,
     excluded_folder = MSVSProject.Filter('_excluded_files',
                                          contents=excluded_result)
     result.append(excluded_folder)
+
+  if msvs_version and msvs_version.UsesVcxproj():
+    return result
+
+  # Populate all the folders.
+  for f in folders:
+    contents = _ConvertSourcesToFilterHierarchy(folders[f], prefix=prefix + [f],
+                                                excluded=excluded,
+                                                list_excluded=list_excluded)
+    contents = MSVSProject.Filter(f, contents=contents)
+    result.append(contents)
   return result
 
 
@@ -971,8 +991,9 @@ def _GenerateMSVSProject(project, options, version, generator_flags):
                         actions_to_add)
   list_excluded = generator_flags.get('msvs_list_excluded_files', True)
   sources, excluded_sources, excluded_idl = (
-      _AdjustSourcesAndConvertToFilterHierarchy(
-          spec, options, project_dir, sources, excluded_sources, list_excluded))
+      _AdjustSourcesAndConvertToFilterHierarchy(spec, options, project_dir,
+                                                sources, excluded_sources,
+                                                list_excluded, version))
 
   # Add in files.
   missing_sources = _VerifySourcesExist(sources, project_dir)
@@ -1416,7 +1437,7 @@ def _PrepareListOfSources(spec, generator_flags, gyp_file):
 
 
 def _AdjustSourcesAndConvertToFilterHierarchy(
-    spec, options, gyp_dir, sources, excluded_sources, list_excluded):
+    spec, options, gyp_dir, sources, excluded_sources, list_excluded, version):
   """Adjusts the list of sources and excluded sources.
 
   Also converts the sets to lists.
@@ -1427,6 +1448,7 @@ def _AdjustSourcesAndConvertToFilterHierarchy(
     gyp_dir: The path to the gyp file being processed.
     sources: A set of sources to be included for this project.
     excluded_sources: A set of sources to be excluded for this project.
+    version: A MSVSVersion object.
   Returns:
     A trio of (list of sources, list of excluded sources,
                path of excluded IDL file)
@@ -1451,7 +1473,8 @@ def _AdjustSourcesAndConvertToFilterHierarchy(
   # Convert to folders and the right slashes.
   sources = [i.split('\\') for i in sources]
   sources = _ConvertSourcesToFilterHierarchy(sources, excluded=fully_excluded,
-                                             list_excluded=list_excluded)
+                                             list_excluded=list_excluded,
+                                             msvs_version=version)
 
   # Prune filters with a single child to flatten ugly directory structures
   # such as ../../src/modules/module1 etc.
@@ -3126,7 +3149,7 @@ def _GenerateMSBuildProject(project, options, version, generator_flags):
       _AdjustSourcesAndConvertToFilterHierarchy(spec, options,
                                                 project_dir, sources,
                                                 excluded_sources,
-                                                list_excluded))
+                                                list_excluded, version))
 
   # Don't add actions if we are using an external builder like ninja.
   if not spec.get('msvs_external_builder'):
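
The msvs.py changes above make the filter-hierarchy builder handle MSVS 2008-and-earlier (.vcproj) projects by first bucketing sources under their top-level folder in an OrderedDict, so loose files are processed before any sub-folder is recursed into and folder order stays stable. A rough standalone sketch of that grouping step (simplified data, not the generator's real structures; collections.OrderedDict stands in here for the copy gyp bundles in ordered_dict.py):

    from collections import OrderedDict

    def group_by_top_folder(split_sources):
        """Bucket already-split source paths (lists of path components) by
        their first component, preserving the order folders are first seen."""
        loose, folders = [], OrderedDict()
        for s in split_sources:
            if len(s) == 1:
                loose.append(s[0])                          # file at this level
            else:
                folders.setdefault(s[0], []).append(s[1:])  # defer to sub-folder
        return loose, folders

    loose, folders = group_by_top_folder(
        [['a.cc'], ['ui', 'win', 'x.cc'], ['ui', 'y.cc'], ['net', 'z.cc']])
    # loose   == ['a.cc']
    # folders == OrderedDict([('ui', [['win', 'x.cc'], ['y.cc']]),
    #                         ('net', [['z.cc']])])
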
diff --git a/tools/gyp/pylib/gyp/generator/ninja.py b/tools/gyp/pylib/gyp/generator/ninja.py
index d3db2c8..7461814 100644
@@ -1037,12 +1037,13 @@ class NinjaWriter:
           self.GypPathToNinja, arch)
       ldflags = env_ldflags + ldflags
     elif self.flavor == 'win':
-      manifest_name = self.GypPathToUniqueOutput(
+      manifest_base_name = self.GypPathToUniqueOutput(
           self.ComputeOutputFileName(spec))
       ldflags, intermediate_manifest, manifest_files = \
           self.msvs_settings.GetLdflags(config_name, self.GypPathToNinja,
-                                        self.ExpandSpecial, manifest_name,
-                                        is_executable, self.toplevel_build)
+                                        self.ExpandSpecial, manifest_base_name,
+                                        output, is_executable,
+                                        self.toplevel_build)
       ldflags = env_ldflags + ldflags
       self.WriteVariableList(ninja_file, 'manifests', manifest_files)
       implicit_deps = implicit_deps.union(manifest_files)
@@ -1095,16 +1096,27 @@ class NinjaWriter:
       extra_bindings.append(('lib',
                             gyp.common.EncodePOSIXShellArgument(output)))
       if self.flavor == 'win':
-        extra_bindings.append(('dll', output))
+        extra_bindings.append(('binary', output))
         if '/NOENTRY' not in ldflags:
           self.target.import_lib = output + '.lib'
           extra_bindings.append(('implibflag',
                                  '/IMPLIB:%s' % self.target.import_lib))
+          pdbname = self.msvs_settings.GetPDBName(
+              config_name, self.ExpandSpecial, output + '.pdb')
           output = [output, self.target.import_lib]
+          if pdbname:
+            output.append(pdbname)
       elif not self.is_mac_bundle:
         output = [output, output + '.TOC']
       else:
         command = command + '_notoc'
+    elif self.flavor == 'win':
+      extra_bindings.append(('binary', output))
+      pdbname = self.msvs_settings.GetPDBName(
+          config_name, self.ExpandSpecial, output + '.pdb')
+      if pdbname:
+        output = [output, pdbname]
+
 
     if len(solibs):
       extra_bindings.append(('solibs', gyp.common.EncodePOSIXShellList(solibs)))
@@ -1545,7 +1557,10 @@ def GetDefaultConcurrentLinks():
 
     mem_limit = max(1, stat.ullTotalPhys / (4 * (2 ** 30)))  # total / 4GB
     hard_cap = max(1, int(os.getenv('GYP_LINK_CONCURRENCY_MAX', 2**32)))
-    return min(mem_limit, hard_cap)
+    # return min(mem_limit, hard_cap)
+    # TODO(scottmg): Temporary speculative fix for OOM on builders
+    # See http://crbug.com/333000.
+    return 2
   elif sys.platform.startswith('linux'):
     with open("/proc/meminfo") as meminfo:
       memtotal_re = re.compile(r'^MemTotal:\s*(\d*)\s*kB')
@@ -1591,33 +1606,35 @@ def _AddWinLinkRules(master_ninja, embed_manifest):
                'resname': resource_name,
                'embed': embed_manifest }
   rule_name_suffix = _GetWinLinkRuleNameSuffix(embed_manifest)
-  dlldesc = 'LINK%s(DLL) $dll' % rule_name_suffix.upper()
-  dllcmd = ('%s gyp-win-tool link-wrapper $arch '
-            '$ld /nologo $implibflag /DLL /OUT:$dll '
-            '/PDB:$dll.pdb @$dll.rsp' % sys.executable)
-  dllcmd = FullLinkCommand(dllcmd, '$dll', 'dll')
+  use_separate_mspdbsrv = (
+      int(os.environ.get('GYP_USE_SEPARATE_MSPDBSRV', '0')) != 0)
+  dlldesc = 'LINK%s(DLL) $binary' % rule_name_suffix.upper()
+  dllcmd = ('%s gyp-win-tool link-wrapper $arch %s '
+            '$ld /nologo $implibflag /DLL /OUT:$binary '
+            '@$binary.rsp' % (sys.executable, use_separate_mspdbsrv))
+  dllcmd = FullLinkCommand(dllcmd, '$binary', 'dll')
   master_ninja.rule('solink' + rule_name_suffix,
                     description=dlldesc, command=dllcmd,
-                    rspfile='$dll.rsp',
+                    rspfile='$binary.rsp',
                     rspfile_content='$libs $in_newline $ldflags',
                     restat=True,
                     pool='link_pool')
   master_ninja.rule('solink_module' + rule_name_suffix,
                     description=dlldesc, command=dllcmd,
-                    rspfile='$dll.rsp',
+                    rspfile='$binary.rsp',
                     rspfile_content='$libs $in_newline $ldflags',
                     restat=True,
                     pool='link_pool')
   # Note that ldflags goes at the end so that it has the option of
   # overriding default settings earlier in the command line.
-  exe_cmd = ('%s gyp-win-tool link-wrapper $arch '
-             '$ld /nologo /OUT:$out /PDB:$out.pdb @$out.rsp' %
-              sys.executable)
-  exe_cmd = FullLinkCommand(exe_cmd, '$out', 'exe')
+  exe_cmd = ('%s gyp-win-tool link-wrapper $arch %s '
+             '$ld /nologo /OUT:$binary @$binary.rsp' %
+              (sys.executable, use_separate_mspdbsrv))
+  exe_cmd = FullLinkCommand(exe_cmd, '$binary', 'exe')
   master_ninja.rule('link' + rule_name_suffix,
-                    description='LINK%s $out' % rule_name_suffix.upper(),
+                    description='LINK%s $binary' % rule_name_suffix.upper(),
                     command=exe_cmd,
-                    rspfile='$out.rsp',
+                    rspfile='$binary.rsp',
                     rspfile_content='$in_newline $libs $ldflags',
                     pool='link_pool')
 
@@ -1877,7 +1894,7 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params,
     master_ninja.rule(
         'alink',
         description='LIB $out',
-        command=('%s gyp-win-tool link-wrapper $arch '
+        command=('%s gyp-win-tool link-wrapper $arch False '
                  '$ar /nologo /ignore:4221 /OUT:$out @$out.rsp' %
                  sys.executable),
         rspfile='$out.rsp',
@@ -2027,7 +2044,8 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params,
 
     this_make_global_settings = data[build_file].get('make_global_settings', [])
     assert make_global_settings == this_make_global_settings, (
-        "make_global_settings needs to be the same for all targets.")
+        "make_global_settings needs to be the same for all targets. %s vs. %s" %
+        (this_make_global_settings, make_global_settings))
 
     spec = target_dicts[qualified_target]
     if flavor == 'mac':
diff --git a/tools/gyp/pylib/gyp/mac_tool.py b/tools/gyp/pylib/gyp/mac_tool.py
index c61a3ef..ac19b6d 100755
@@ -503,7 +503,8 @@ class MacTool(object):
     if isinstance(data, list):
       return [self._ExpandVariables(v, substitutions) for v in data]
     if isinstance(data, dict):
-      return {k: self._ExpandVariables(data[k], substitutions) for k in data}
+      return dict((k, self._ExpandVariables(data[k],
+                                            substitutions)) for k in data)
     return data
 
 if __name__ == '__main__':
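
The mac_tool.py hunk replaces a dict comprehension with dict() over a generator expression. Both build the same mapping, but dict comprehensions only exist from Python 2.7 on, so the rewritten form also runs on older 2.x interpreters, which appears to be the point of the change. A quick illustration (expand() is a stand-in for MacTool._ExpandVariables on a plain string, and the sample keys are made up):

    substitutions = {'PRODUCT_NAME': 'node'}
    data = {'CFBundleName': '${PRODUCT_NAME}', 'CFBundleVersion': '1.0'}

    def expand(value, subs):
        # Stand-in: replace ${KEY} occurrences with their substitution value.
        for key, sub in subs.items():
            value = value.replace('${%s}' % key, sub)
        return value

    new_style = {k: expand(data[k], substitutions) for k in data}        # 2.7+
    old_style = dict((k, expand(data[k], substitutions)) for k in data)  # 2.4+
    assert new_style == old_style == {'CFBundleName': 'node',
                                      'CFBundleVersion': '1.0'}
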
diff --git a/tools/gyp/pylib/gyp/msvs_emulation.py b/tools/gyp/pylib/gyp/msvs_emulation.py
index 3435bbc..6428fce 100644
@@ -317,15 +317,20 @@ class MsvsSettings(object):
           output_file, config=config))
     return output_file
 
-  def GetPDBName(self, config, expand_special):
-    """Gets the explicitly overridden pdb name for a target or returns None
-    if it's not overridden."""
+  def GetPDBName(self, config, expand_special, default):
+    """Gets the explicitly overridden pdb name for a target or returns
+    default if it's not overridden, or if no pdb will be generated."""
     config = self._TargetConfig(config)
     output_file = self._Setting(('VCLinkerTool', 'ProgramDatabaseFile'), config)
-    if output_file:
-      output_file = expand_special(self.ConvertVSMacros(
-          output_file, config=config))
-    return output_file
+    generate_debug_info = self._Setting(
+        ('VCLinkerTool', 'GenerateDebugInformation'), config)
+    if generate_debug_info:
+      if output_file:
+        return expand_special(self.ConvertVSMacros(output_file, config=config))
+      else:
+        return default
+    else:
+      return None
 
   def GetCflags(self, config):
     """Returns the flags that need to be added to .c and .cc compilations."""
@@ -454,7 +459,7 @@ class MsvsSettings(object):
     return output_file
 
   def GetLdflags(self, config, gyp_to_build_path, expand_special,
-                 manifest_base_name, is_executable, build_dir):
+                 manifest_base_name, output_name, is_executable, build_dir):
     """Returns the flags that need to be added to link commands, and the
     manifest files."""
     config = self._TargetConfig(config)
@@ -472,7 +477,7 @@ class MsvsSettings(object):
     out = self.GetOutputName(config, expand_special)
     if out:
       ldflags.append('/OUT:' + out)
-    pdb = self.GetPDBName(config, expand_special)
+    pdb = self.GetPDBName(config, expand_special, output_name + '.pdb')
     if pdb:
       ldflags.append('/PDB:' + pdb)
     pgd = self.GetPGDName(config, expand_special)
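
The GetPDBName rework above means a PDB path is reported whenever debug info is being generated: an explicit ProgramDatabaseFile setting wins if present, otherwise the caller-supplied default ('<output name>.pdb') is used, and None is returned when no PDB will exist at all. The decision reduces to roughly this (simplified to plain booleans; the setting lookups are assumed to have already happened):

    def pdb_name(program_database_file, generate_debug_info, output_name):
        """Pick the PDB path the link step will produce, or None if none."""
        if not generate_debug_info:
            return None                    # no debug info: no PDB to track
        if program_database_file:
            return program_database_file   # explicit ProgramDatabaseFile wins
        return output_name + '.pdb'        # default the callers now supply

    assert pdb_name(None, True, 'node.exe') == 'node.exe.pdb'
    assert pdb_name('custom.pdb', True, 'node.exe') == 'custom.pdb'
    assert pdb_name(None, False, 'node.exe') is None
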
diff --git a/tools/gyp/pylib/gyp/ordered_dict.py b/tools/gyp/pylib/gyp/ordered_dict.py
index a3609ad..a1e89f9 100644
@@ -166,6 +166,8 @@ class OrderedDict(dict):
         for k in self:
             yield (k, self[k])
 
+    # Suppress 'OrderedDict.update: Method has no argument':
+    # pylint: disable=E0211
     def update(*args, **kwds):
         '''od.update(E, **F) -> None.  Update od from dict/iterable E and F.
 
diff --git a/tools/gyp/pylib/gyp/win_tool.py b/tools/gyp/pylib/gyp/win_tool.py
index 1634ff9..e9d7df0 100755
@@ -18,9 +18,9 @@ import sys
 
 BASE_DIR = os.path.dirname(os.path.abspath(__file__))
 
-# A regex matching an argument corresponding to a PDB filename passed as an
-# argument to link.exe.
-_LINK_EXE_PDB_ARG = re.compile('/PDB:(?P<pdb>.+\.exe\.pdb)$', re.IGNORECASE)
+# A regex matching an argument corresponding to the output filename passed to
+# link.exe.
+_LINK_EXE_OUT_ARG = re.compile('/OUT:(?P<out>.+)$', re.IGNORECASE)
 
 def main(args):
   executor = WinTool()
@@ -33,25 +33,22 @@ class WinTool(object):
   """This class performs all the Windows tooling steps. The methods can either
   be executed directly, or dispatched from an argument list."""
 
-  def _MaybeUseSeparateMspdbsrv(self, env, args):
-    """Allows to use a unique instance of mspdbsrv.exe for the linkers linking
-    an .exe target if GYP_USE_SEPARATE_MSPDBSRV has been set."""
-    if not os.environ.get('GYP_USE_SEPARATE_MSPDBSRV'):
-      return
-
+  def _UseSeparateMspdbsrv(self, env, args):
+    """Allows to use a unique instance of mspdbsrv.exe per linker instead of a
+    shared one."""
     if len(args) < 1:
       raise Exception("Not enough arguments")
 
     if args[0] != 'link.exe':
       return
 
-    # Checks if this linker produces a PDB for an .exe target. If so use the
-    # name of this PDB to generate an endpoint name for mspdbsrv.exe.
+    # Use the output filename passed to the linker to generate an endpoint name
+    # for mspdbsrv.exe.
     endpoint_name = None
     for arg in args:
-      m = _LINK_EXE_PDB_ARG.match(arg)
+      m = _LINK_EXE_OUT_ARG.match(arg)
       if m:
-        endpoint_name = '%s_%d' % (m.group('pdb'), os.getpid())
+        endpoint_name = '%s_%d' % (m.group('out'), os.getpid())
         break
 
     if endpoint_name is None:
@@ -99,13 +96,14 @@ class WinTool(object):
     else:
       shutil.copy2(source, dest)
 
-  def ExecLinkWrapper(self, arch, *args):
+  def ExecLinkWrapper(self, arch, use_separate_mspdbsrv, *args):
     """Filter diagnostic output from link that looks like:
     '   Creating library ui.dll.lib and object ui.dll.exp'
     This happens when there are exports from the dll or exe.
     """
     env = self._GetEnv(arch)
-    self._MaybeUseSeparateMspdbsrv(env, args)
+    if use_separate_mspdbsrv == 'True':
+      self._UseSeparateMspdbsrv(env, args)
     link = subprocess.Popen(args,
                             shell=True,
                             env=env,
@@ -280,6 +278,11 @@ class WinTool(object):
     """Runs an action command line from a response file using the environment
     for |arch|. If |dir| is supplied, use that as the working directory."""
     env = self._GetEnv(arch)
+    # TODO(scottmg): This is a temporary hack to get some specific variables
+    # through to actions that are set after gyp-time. http://crbug.com/333738.
+    for k, v in os.environ.iteritems():
+      if k not in env:
+        env[k] = v
     args = open(rspfile).read()
     dir = dir[0] if dir else None
     return subprocess.call(args, shell=True, env=env, cwd=dir)
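
Finally, the win_tool.py change derives the per-link mspdbsrv endpoint from the linker's /OUT: argument instead of requiring an explicit /PDB: one, and the caller now passes the use_separate_mspdbsrv decision in as a literal 'True'/'False' string on the gyp-win-tool command line. A small sketch of the endpoint derivation (argument values are illustrative):

    import os
    import re

    # Same pattern the tool now uses to find the linker's output argument.
    _LINK_EXE_OUT_ARG = re.compile(r'/OUT:(?P<out>.+)$', re.IGNORECASE)

    def endpoint_name_for(link_args):
        """Build a per-invocation endpoint name for mspdbsrv.exe, or None."""
        for arg in link_args:
            m = _LINK_EXE_OUT_ARG.match(arg)
            if m:
                return '%s_%d' % (m.group('out'), os.getpid())
        return None

    print(endpoint_name_for(['link.exe', '/nologo', '/OUT:node.exe', '@node.rsp']))
    # -> something like 'node.exe_4242'
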