Merge pull request #9 from sam0x17/import-tokens-attrs-new-args
Allow Path or Expr evaluating to Into<String> as arg for import_tokens_proc and import_tokens_attr
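At the call site, this means the argument to these attributes no longer has to be a literal path overriding the default `::macro_magic` root; any expression whose value can be passed to `String::from(..)` and re-parsed as a path at expansion time now also works. A rough sketch of both forms (the paths and the helper function below are illustrative, not part of this crate):

```rust
use macro_magic::import_tokens_attr;
use proc_macro::TokenStream;

// Still supported: a literal `Path` overriding the default `::macro_magic` root.
#[import_tokens_attr(my_crate::__private::macro_magic)]
#[proc_macro_attribute]
pub fn my_attr(attr: TokenStream, tokens: TokenStream) -> TokenStream {
    tokens
}

// New with this PR: an `Expr` evaluating to something `Into<String>`, resolved and
// re-parsed as a path when the generated macro expands. `resolve_macro_magic_root()`
// is a hypothetical helper, not something provided by this crate.
#[import_tokens_attr(resolve_macro_magic_root())]
#[proc_macro_attribute]
pub fn my_other_attr(attr: TokenStream, tokens: TokenStream) -> TokenStream {
    tokens
}
```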
sam0x17 committed Jun 13, 2023
2 parents fcf29fc + 1a737be commit 91c0bed
Showing 5 changed files with 240 additions and 106 deletions.
38 changes: 27 additions & 11 deletions README.md
@@ -14,7 +14,8 @@ them by name/path.

Among other things, the patterns introduced by `macro_magic` can be used to implement safe and
efficient exportation and importation of item tokens within the same file, and even across file
and crate boundaries.
and crate boundaries. The only requirement is that you have control over (i.e. can add an attribute
macro to) the source code of both locations.
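For example, here is a minimal sketch of the pattern (assuming the 0.4-style attribute API this crate exports; see the rest of this README for the full details):

```rust
// crate_a: mark an item for export (no proc-macro features needed here).
use macro_magic::export_tokens;

#[export_tokens]
struct Config {
    max_retries: u8,
}
```

```rust
// proc_macro_crate: an attribute that receives the exported tokens of a foreign item.
use macro_magic::import_tokens_attr;
use proc_macro::TokenStream;

#[import_tokens_attr]
#[proc_macro_attribute]
pub fn with_config(attr: TokenStream, tokens: TokenStream) -> TokenStream {
    // At a call site like `#[with_config(crate_a::Config)]`, `attr` contains the
    // tokens of `crate_a::Config` and `tokens` contains the attached item.
    let _foreign = syn::parse_macro_input!(attr as syn::ItemStruct);
    tokens
}
```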

`macro_magic` is designed to work with stable Rust, and is fully `no_std` compatible (in fact,
there is a unit test to ensure everything is `no_std` safe).
@@ -230,18 +231,33 @@ errors complaining about items not existing under `macro_magic::mm_core`. The
use it without enabling this feature.

The reason for this feature gating is that things like `syn`, `quote`, `proc_macro2`, etc., are
not 100% `no_std` compatible and should only be enabled in proc macro crates
not 100% `no_std` compatible and should only be enabled in proc macro crates. For this reason,
you _should not_ enable this feature in crates where you are merely using `#[export_tokens]`
and nothing else within that crate.

## Limitations

One thing that `macro_magic` _doesn't_ provide is the ability to build up state information
across multiple macro invocations, however this problem can be tackled effectively using the
[outer macro pattern](https://www.youtube.com/watch?v=aEWbZxNCH0A). There is also my
(deprecated but functional) [macro_state](https://crates.io/crates/macro_state) crate, which
relies on some incidental features of the rust compiler that could be removed in the future.

Note that the transition from 0.1.7 to 0.2.0 of `macro_magic` removed and/or re-wrote a number
of features that relied on a non-future-proof behavior of writing/reading files from the
`OUT_DIR`. Versions of `macro_magic` >= 0.2.0 are completely future-proof and safe, however
features that provided the ability to enumerate all the `#[export_tokens]` calls in a namespace
have been removed. The proper way to do this is with the outer macro pattern, mentioned above.
[outer macro pattern](https://www.youtube.com/watch?v=aEWbZxNCH0A) or in some cases using
static atomics and mutexes in your proc macro crate (which we actually do in this crate to keep
track of unique identifiers).
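
As a rough illustration of the static-atomic approach (the names here are illustrative; this crate uses a very similar counter internally to generate unique macro identifiers):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Lives in the proc macro crate. Each expansion grabs the next value, so identifiers
// derived from it will not collide within one compiler invocation that expands them.
static COUNTER: AtomicUsize = AtomicUsize::new(0);

fn unique_ident(base: &str) -> proc_macro2::Ident {
    let n = COUNTER.fetch_add(1, Ordering::SeqCst);
    quote::format_ident!("__{}_{}", base, n)
}
```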

## Breaking Changes

- **0.4x** removed `#[use_attr]` and `#[use_proc]` (they are no longer needed with the new
self-calling macro style that has been adopted in 0.4x) and also removed the ability to
access `#[export_tokens]` invocations in inaccessible locations like inside of functions and
across module permission boundaries like in an inaccessible private module. This feature may
be re-added in the future if there is interest, however removing it allowed us to consolidate
naming of our `macro_rules!` declarations and remove the need for `#[use_attr]` /
`#[use_proc]`.
- **0.2x** removed and/or re-wrote a number of features that relied on a non-future-proof
behavior of writing/reading files in the `OUT_DIR`. Versions >= 0.2.0 are completely safe and
no longer contain this behavior, however features that provided the ability to enumerate all
the `#[export_tokens]` calls in a namespace have been removed. The proper way to do this is
with the outer macro pattern or with global state mutexes/atomics in your proc macro crate,
as mentioned above.

More detailed historical change information can be found in
[releases](https://github.com/sam0x17/docify/releases).
129 changes: 98 additions & 31 deletions core/src/lib.rs
@@ -1,21 +1,26 @@
//! This crate contains most of the internal implementation of the macros in the
//! `macro_magic_macros` crate. For the most part, the proc macros in `macro_magic_macros` just
//! call their respective `_internal` variants in this crate.
#![warn(missing_docs)]

use std::sync::atomic::{AtomicUsize, Ordering};

use derive_syn_parse::Parse;
use macro_magic_core_macros::*;
use proc_macro2::{Delimiter, Group, Punct, Spacing, Span, TokenStream as TokenStream2};
use proc_macro2::{Delimiter, Group, Punct, Spacing, Span, TokenStream as TokenStream2, TokenTree};
use quote::{format_ident, quote, ToTokens, TokenStreamExt};
use syn::{
parse::Nothing,
parse::{Nothing, ParseStream},
parse2, parse_quote,
spanned::Spanned,
token::{Brace, Comma},
Attribute, Error, FnArg, Ident, Item, ItemFn, Pat, Path, Result, Token, Visibility,
Attribute, Error, Expr, FnArg, Ident, Item, ItemFn, Pat, Path, Result, Token, Visibility,
};

/// Constant used to load the configured location for `macro_magic` that will be used in
/// generated macro code.
///
/// See also [`get_macro_magic_root`].
pub const MACRO_MAGIC_ROOT: &'static str = get_macro_magic_root!();

/// A global counter, can be used to generate a relatively unique identifier.
@@ -41,6 +46,7 @@ mod keywords {
pub struct ForwardTokensExtraArg {
#[brace]
_brace: Brace,
/// Contains the underlying [`TokenStream2`] inside the brace.
#[inside(_brace)]
pub stream: TokenStream2,
}
@@ -63,6 +69,7 @@ pub struct ForwardTokensArgs {
/// The path of the macro that will receive the forwarded tokens
pub target: Path,
_comma2: Option<Comma>,
/// Contains the override path that will be used instead of `::macro_magic`, if specified.
#[parse_if(_comma2.is_some())]
pub mm_path: Option<Path>,
_comma3: Option<Comma>,
@@ -97,27 +104,37 @@ pub struct ForwardedTokens {
/// You shouldn't need to use this directly.
#[derive(Parse)]
pub struct AttrItemWithExtra {
/// Contains the [`Item`] that is being imported (i.e. the item whose tokens we are
/// obtaining)
pub imported_item: Item,
_comma1: Comma,
#[brace]
_brace: Brace,
#[brace]
#[inside(_brace)]
_tokens_ident_brace: Brace,
/// A [`TokenStream2`] representing the raw tokens for the [`struct@Ident`] the generated
/// macro will use to refer to the tokens argument of the macro.
#[inside(_tokens_ident_brace)]
pub tokens_ident: TokenStream2,
#[inside(_brace)]
_comma2: Comma,
#[brace]
#[inside(_brace)]
_source_path_brace: Brace,
/// Represents the path of the item that is being imported.
#[inside(_source_path_brace)]
pub source_path: TokenStream2,
#[inside(_brace)]
_comma3: Comma,
#[brace]
#[inside(_brace)]
_custom_tokens_brace: Brace,
/// when `#[with_custom_parsing(..)]` is used, the variable `__custom_tokens` will be
/// populated in the resulting proc macro containing the raw [`TokenStream2`] for the
/// tokens before custom parsing has been applied. This allows you to make use of any extra
/// context information that may be obtained during custom parsing that you need to utilize
/// in the final macro.
#[inside(_custom_tokens_brace)]
pub custom_tokens: TokenStream2,
}
@@ -128,8 +145,11 @@ pub struct AttrItemWithExtra {
#[derive(Parse)]
pub struct ImportTokensArgs {
_let: Token![let],
/// The [`struct@Ident`] for the `tokens` variable. Usually called `tokens` but could be
/// something different, hence this variable.
pub tokens_var_ident: Ident,
_eq: Token![=],
/// The [`Path`] where the item we are importing can be found.
pub source_path: Path,
}

@@ -138,21 +158,14 @@ pub struct ImportTokensArgs {
/// You shouldn't need to use this directly.
#[derive(Parse)]
pub struct ImportedTokens {
/// Represents the [`struct@Ident`] that was used to refer to the `tokens` in the original
/// [`ImportTokensArgs`].
pub tokens_var_ident: Ident,
_comma: Comma,
/// Contains the [`Item`] that has been imported.
pub item: Item,
}

#[derive(Parse)]
pub struct BasicUseStmt {
#[call(Attribute::parse_outer)]
pub attrs: Vec<Attribute>,
pub vis: Visibility,
_use: Token![use],
pub path: Path,
_semi: Token![;],
}

/// Delineates the different types of proc macro
#[derive(Copy, Clone, Eq, PartialEq, Debug)]
pub enum ProcMacroType {
@@ -212,9 +225,13 @@ impl ProcMacroType {
/// }
/// ```
pub trait ForeignPath {
/// Returns the path of the foreign item whose tokens will be imported.
///
/// This is used with custom parsing. See [`ForeignPath`] for more info.
fn foreign_path(&self) -> &syn::Path;
}

/// Generically parses a proc macro definition with support for all variants.
#[derive(Clone)]
pub struct ProcMacro {
/// The underlying proc macro function definition
@@ -401,6 +418,10 @@ pub fn export_tokens_macro_ident(ident: &Ident) -> Ident {
Ident::new(ident_string.as_str(), Span::call_site())
}

/// Resolves to the path of the `#[export_tokens]` macro for the given item path.
///
/// If the specified [`Path`] doesn't exist or there isn't a valid `#[export_tokens]` attribute
/// on the item at that path, the returned macro path will be invalid.
pub fn export_tokens_macro_path(item_path: &Path) -> Path {
let Some(last_seg) = item_path.segments.last() else { unreachable!("must have at least one segment") };
let mut leading_segs = item_path
@@ -415,6 +436,7 @@ pub fn export_tokens_macro_path(item_path: &Path) -> Path {
parse_quote!(#(#leading_segs)::*)
}

/// Generates a new unique `#[export_tokens]` macro identifier
fn new_unique_export_tokens_ident(ident: &Ident) -> Ident {
let unique_id = COUNTER.fetch_add(1, Ordering::SeqCst);
let ident = flatten_ident(ident);
@@ -703,6 +725,51 @@ pub fn with_custom_parsing_internal<T1: Into<TokenStream2>, T2: Into<TokenStream
Ok(quote!(#item_fn))
}

/// Parses the (attribute) args of [`import_tokens_attr_internal`] and
/// [`import_tokens_proc_internal`], which can now evaluate to either a `Path` or an `Expr`
/// that is expected to be able to be placed in a `String::from(x)`.
enum OverridePath {
Path(Path),
Expr(Expr),
}

impl syn::parse::Parse for OverridePath {
fn parse(input: ParseStream) -> Result<Self> {
if input.is_empty() {
return Ok(OverridePath::Path(macro_magic_root()));
}
let mut remaining = TokenStream2::new();
while !input.is_empty() {
remaining.extend(input.parse::<TokenTree>()?.to_token_stream());
}
if let Ok(path) = parse2::<Path>(remaining.clone()) {
return Ok(OverridePath::Path(path));
}
match parse2::<Expr>(remaining) {
Ok(expr) => Ok(OverridePath::Expr(expr)),
Err(mut err) => {
err.combine(Error::new(
input.span(),
"Expected either a `Path` or an `Expr` that evaluates to something compatible with `Into<String>`."
));
Err(err)
}
}
}
}

impl ToTokens for OverridePath {
fn to_tokens(&self, tokens: &mut TokenStream2) {
match self {
OverridePath::Path(path) => {
let path = path.to_token_stream().to_string();
tokens.extend(quote!(#path))
}
OverridePath::Expr(expr) => tokens.extend(quote!(#expr)),
}
}
}

/// Internal implementation for the `#[import_tokens_attr]` attribute.
///
/// You shouldn't need to use this directly, but it may be useful if you wish to rebrand/rename
Expand All @@ -712,11 +779,7 @@ pub fn import_tokens_attr_internal<T1: Into<TokenStream2>, T2: Into<TokenStream2
tokens: T2,
) -> Result<TokenStream2> {
let attr = attr.into();
let mm_override_path = if attr.is_empty() {
macro_magic_root()
} else {
parse2::<Path>(attr)?
};
let mm_override_path = parse2::<OverridePath>(attr)?;
let mm_path = macro_magic_root();
let mut proc_macro = parse_proc_macro_variant(tokens, ProcMacroType::Attribute)?;

@@ -762,7 +825,7 @@ pub fn import_tokens_attr_internal<T1: Into<TokenStream2>, T2: Into<TokenStream2
let pound = Punct::new('#', Spacing::Alone);

// final quoted tokens
Ok(quote! {
let output = quote! {
#(#orig_attrs)
*
pub #orig_sig {
@@ -799,12 +862,16 @@ pub fn import_tokens_attr_internal<T1: Into<TokenStream2>, T2: Into<TokenStream2
let attached_item = attached_item.to_token_stream();
#path_resolver
let path = path.to_token_stream();
let custon_parsed = custom_parsed.to_token_stream();
let custom_parsed = custom_parsed.to_token_stream();
let resolved_mm_override_path = match syn::parse2::<syn::Path>(String::from(#mm_override_path).parse().unwrap()) {
Ok(res) => res,
Err(err) => return err.to_compile_error().into()
};
quote::quote! {
#mm_override_path::forward_tokens! {
#pound resolved_mm_override_path::forward_tokens! {
#pound path,
#orig_sig_ident,
#mm_override_path,
#pound resolved_mm_override_path,
{
{ #pound attached_item },
{ #pound path },
Expand All @@ -814,8 +881,8 @@ pub fn import_tokens_attr_internal<T1: Into<TokenStream2>, T2: Into<TokenStream2
}.into()
}
}

})
};
Ok(output)
}

/// Internal implementation for the `#[import_tokens_proc]` attribute.
Expand All @@ -827,11 +894,7 @@ pub fn import_tokens_proc_internal<T1: Into<TokenStream2>, T2: Into<TokenStream2
tokens: T2,
) -> Result<TokenStream2> {
let attr = attr.into();
let mm_override_path = if attr.is_empty() {
macro_magic_root()
} else {
parse2::<Path>(attr)?
};
let mm_override_path = parse2::<OverridePath>(attr)?;
let mm_path = macro_magic_root();
let proc_macro = parse_proc_macro_variant(tokens, ProcMacroType::Normal)?;

@@ -885,11 +948,15 @@ pub fn import_tokens_proc_internal<T1: Into<TokenStream2>, T2: Into<TokenStream2
Ok(path) => path,
Err(e) => return e.to_compile_error().into(),
};
let resolved_mm_override_path = match syn::parse2::<syn::Path>(String::from(#mm_override_path).parse().unwrap()) {
Ok(res) => res,
Err(err) => return err.to_compile_error().into()
};
quote::quote! {
#mm_override_path::forward_tokens! {
#pound resolved_mm_override_path::forward_tokens! {
#pound source_path,
#orig_sig_ident,
#mm_override_path
#pound resolved_mm_override_path
}
}.into()
}